
bayesian optimization results

description

a bayesianoptimization object contains the results of a bayesian optimization. it is the output of bayesopt or of a fit function that accepts the optimizehyperparameters name-value argument, such as fitcdiscr. in addition, a bayesianoptimization object contains data for each iteration of bayesopt that can be accessed by a plot function or an output function.

creation

create a bayesianoptimization object by using the bayesopt function, or by calling a fit function (such as fitcdiscr, fitcknn, or fitcsvm) with the optimizehyperparameters name-value argument. the sketch below illustrates both routes.
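
this is a minimal sketch, assuming an objective function handle fun, a vector of optimizablevariable objects vars, and the ionosphere predictor and response data X and Y; the output names are placeholders, not part of the class interface.

% route 1: call bayesopt directly; results is a bayesianoptimization object
results = bayesopt(fun,vars);

% route 2: let a fit function run the optimization; the object is stored in
% the model's HyperparameterOptimizationResults property
mdl = fitcknn(X,Y,'OptimizeHyperparameters','auto');
resultsFromFit = mdl.HyperparameterOptimizationResults;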

properties

problem definition properties

this property is read-only.

objectivefcn argument used by bayesopt, specified as a function handle.

  • if you call bayesopt directly, objectivefcn is the bayesopt objective function argument.

  • if you call a fit function with the optimizehyperparameters name-value argument, objectivefcn is a function handle that returns the misclassification rate for classification, or the logarithm of one plus the cross-validation loss for regression, measured by five-fold cross-validation.

data types: function_handle
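
for a direct call to bayesopt, the objective function accepts a 1-by-d table of variable values and returns a scalar objective (and, optionally, coupled constraint values and user data). a minimal sketch with hypothetical variables a and b:

% the objective receives one row of variable values as a table and returns a scalar
fun = @(x)(x.a - 1).^2 + abs(x.b);
vars = [optimizableVariable('a',[-5,5]), optimizableVariable('b',[-5,5])];
results = bayesopt(fun,vars,'Verbose',0);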

this property is read-only.

variabledescriptions argument that bayesopt used, specified as a vector of optimizablevariable objects.

  • if you called bayesopt directly, variabledescriptions is the bayesopt variable description argument.

  • if you called a fit function with the optimizehyperparameters name-value pair, variabledescriptions is the vector of hyperparameters.

this property is read-only.

options that bayesopt used, specified as a structure.

  • if you called bayesopt directly, options contains the options used by bayesopt, which are its name-value pair arguments (see bayesopt).

  • if you called a fit function with the optimizehyperparameters name-value pair, options contains the default bayesopt options, modified by the hyperparameteroptimizationoptions name-value pair.

options is a read-only structure containing the following fields.

  • acquisitionfunctionname - acquisition function name. see acquisition function types.
  • isobjectivedeterministic - true means the objective function is deterministic, false otherwise.
  • explorationratio - used only when acquisitionfunctionname is 'expected-improvement-plus' or 'expected-improvement-per-second-plus'. see plus.
  • maxobjectiveevaluations - objective function evaluation limit.
  • maxtime - time limit.
  • xconstraintfcn - deterministic constraints on variables.
  • conditionalvariablefcn - conditional variable constraints.
  • numcoupledconstraints - number of coupled constraints.
  • coupledconstrainttolerances - coupled constraint tolerances.
  • arecoupledconstraintsdeterministic - logical vector specifying whether each coupled constraint is deterministic.
  • verbose - command-line display level.
  • outputfcn - function called after each iteration.
  • savevariablename - variable name for the @assigninbase output function.
  • savefilename - file name for the @savetofile output function.
  • plotfcn - plot function called after each iteration.
  • initialx - points where bayesopt evaluated the objective function.
  • initialobjective - objective function values at initialx.
  • initialconstraintviolations - coupled constraint function values at initialx.
  • initialerrorvalues - error values at initialx.
  • initialobjectiveevaluationtimes - objective function evaluation times at initialx.
  • initialiterationtimes - time for each iteration, including objective function evaluation and other computations.

data types: struct
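
after an optimization you can read these fields directly from the object; a brief sketch, assuming a completed optimization stored in a bayesianoptimization object named results:

% a few of the option fields stored with the results
results.Options.AcquisitionFunctionName     % acquisition function used for this run
results.Options.MaxObjectiveEvaluations     % evaluation budget
results.Options.IsObjectiveDeterministic   % whether the objective was declared deterministic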

solution properties

this property is read-only.

minobjective - minimum observed value of the objective function, specified as a real scalar. when there are coupled constraints or evaluation errors, this value is the minimum over all observed points that are feasible according to the final constraint and error models.

data types: double

this property is read-only.

xatminobjective - observed point with the minimum objective function value, specified as a 1-by-d table, where d is the number of variables.

data types: table

this property is read-only.

minestimatedobjective - estimated objective function value at xatminestimatedobjective, specified as a real scalar.

minestimatedobjective is the mean value of the posterior distribution of the final objective model. the software estimates the minestimatedobjective value by passing xatminestimatedobjective to the object function predictobjective.

data types: double

this property is read-only.

xatminestimatedobjective - point with the minimum upper confidence bound of the objective function value among the visited points, specified as a 1-by-d table, where d is the number of variables. the software uses the final objective model to find the upper confidence bounds of the visited points.

xatminestimatedobjective is the same as the best point returned by the bestpoint function with the default criterion ('min-visited-upper-confidence-interval').

data types: table
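
you can cross-check these solution properties with the bestpoint and predictobjective object functions; a minimal sketch, again assuming a completed optimization named results:

% best point under the default criterion ('min-visited-upper-confidence-interval');
% this is the same point as XAtMinEstimatedObjective
[xBest,critValue] = bestPoint(results);

% posterior mean of the final objective model at that point;
% this matches MinEstimatedObjective
estObj = predictObjective(results,results.XAtMinEstimatedObjective);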

this property is read-only.

numobjectiveevaluations - number of objective function evaluations, specified as a positive integer. this count includes the initial evaluations used to form a posterior model as well as the evaluations during the optimization iterations.

data types: double

this property is read-only.

totalelapsedtime - total elapsed time of the optimization in seconds, specified as a positive scalar.

data types: double

this property is read-only.

nextpoint - next point to evaluate if the optimization continues, specified as a 1-by-d table, where d is the number of variables.

data types: table

trace properties

this property is read-only.

xtrace - points where the objective function was evaluated, specified as a t-by-d table, where t is the number of evaluation points and d is the number of variables.

data types: table

this property is read-only.

objectivetrace - history of objective function values, specified as a column vector of length t, where t is the number of evaluation points.

data types: double

this property is read-only.

objectiveevaluationtimetrace - objective function evaluation times, specified as a column vector of length t, where t is the number of evaluation points. these times include the time spent evaluating coupled constraints, because the objective function computes these constraints.

data types: double

this property is read-only.

iterationtimetrace - iteration times, specified as a column vector of length t, where t is the number of evaluation points. iterationtimetrace includes both objective function evaluation time and other overhead.

data types: double

this property is read-only.

constraintstrace - coupled constraint values, specified as a t-by-k array, where t is the number of evaluation points and k is the number of coupled constraints.

data types: double

this property is read-only.

errortrace - error indications, specified as a column vector of length t with entries of -1 or 1, where t is the number of evaluation points. each 1 entry indicates that the objective function errored or returned nan at the corresponding point in xtrace. each -1 entry indicates that the objective function value was computed successfully.

data types: double

this property is read-only.

feasibilitytrace - feasibility indications, specified as a logical column vector of length t, where t is the number of evaluation points. each 1 entry indicates that the final constraint model predicts feasibility at the corresponding point in xtrace.

data types: logical

this property is read-only.

feasibilityprobabilitytrace - probability that each evaluation point is feasible, specified as a column vector of length t, where t is the number of evaluation points. the probabilities come from the final constraint model, including the error constraint model, evaluated at the corresponding points in xtrace.

data types: double

this property is read-only.

indexofminimumtrace - index of the evaluation that gave the minimum feasible objective, specified as a column vector of integer indices of length t, where t is the number of evaluation points. feasibility is determined with respect to the constraint models that existed at each iteration, including the error constraint model.

data types: double

this property is read-only.

objectiveminimumtrace - minimum observed objective, specified as a column vector of length t, where t is the number of evaluation points.

data types: double

this property is read-only.

estimatedobjectiveminimumtrace - estimated objective, specified as a column vector of length t, where t is the number of evaluation points. the estimated objective at each iteration is determined with respect to the objective model at that iteration. at each iteration, the software uses the predictobjective object function to estimate the objective function value at the point with the minimum upper confidence bound of the objective function among the visited points.

data types: double

this property is read-only.

userdatatrace - auxiliary data from the objective function, specified as a cell array of length t, where t is the number of evaluation points. each entry in the cell array is the userdata returned in the third output of the objective function.

data types: cell
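
the trace properties let you review the optimization history after the run; a minimal sketch, assuming a completed optimization named results:

% observed and estimated minimum objective across the iterations
t = results.NumObjectiveEvaluations;
plot(1:t,results.ObjectiveMinimumTrace,'-o', ...
     1:t,results.EstimatedObjectiveMinimumTrace,'-x')
legend('min observed objective','estimated min objective','Location','best')
xlabel('function evaluations'), ylabel('min objective')

% evaluation points where the objective errored or returned NaN
badPoints = results.XTrace(results.ErrorTrace == 1,:);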

object functions

bestpoint - best point in a bayesian optimization according to a criterion
plot - plot bayesian optimization results
predictconstraints - predict coupled constraint violations at a set of points
predicterror - predict error value at a set of points
predictobjective - predict objective function at a set of points
predictobjectiveevaluationtime - predict objective function run times at a set of points
resume - resume a bayesian optimization
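
for example, a finished optimization can be extended or queried through these functions; a sketch, assuming results was returned by bayesopt (how resume counts the evaluation limit is an assumption here; see the resume documentation for the exact semantics):

% continue the optimization with a larger evaluation budget
% (assumption: the limit applies to the resumed run; see the resume documentation)
resultsMore = resume(results,'MaxObjectiveEvaluations',10);

% query the models at the proposed next point
objEst  = predictObjective(resultsMore,resultsMore.NextPoint);
timeEst = predictObjectiveEvaluationTime(resultsMore,resultsMore.NextPoint);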

examples

this example shows how to create a bayesianoptimization object by using bayesopt to minimize cross-validation loss.

optimize hyperparameters of a knn classifier for the ionosphere data, that is, find knn hyperparameters that minimize the cross-validation loss. have bayesopt minimize over the following hyperparameters:

  • nearest-neighborhood sizes from 1 to 30

  • distance functions 'chebychev', 'euclidean', and 'minkowski'.

for reproducibility, set the random seed, set the partition, and set the acquisitionfunctionname option to 'expected-improvement-plus'. to suppress iterative display, set 'verbose' to 0. pass the partition c and fitting data X and Y to the objective function fun by creating fun as an anonymous function that incorporates this data.

load ionosphere
rng default
num = optimizableVariable('n',[1,30],'Type','integer');
dst = optimizableVariable('dst',{'chebychev','euclidean','minkowski'},'Type','categorical');
c = cvpartition(351,'KFold',5);
fun = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x.n,...
    'Distance',char(x.dst),'NSMethod','exhaustive'));
results = bayesopt(fun,[num,dst],'Verbose',0,...
    'AcquisitionFunctionName','expected-improvement-plus')

(figure: objective function model over n and dst, showing observed points, model mean, next point, and model minimum feasible)

(figure: min objective vs. number of function evaluations, showing min observed objective and estimated min objective)

results = 
  bayesianoptimization with properties:
                      objectivefcn: @(x)kfoldloss(fitcknn(x,y,'cvpartition',c,'numneighbors',x.n,'distance',char(x.dst),'nsmethod','exhaustive'))
              variabledescriptions: [1x2 optimizablevariable]
                           options: [1x1 struct]
                      minobjective: 0.1197
                   xatminobjective: [1x2 table]
             minestimatedobjective: 0.1213
          xatminestimatedobjective: [1x2 table]
           numobjectiveevaluations: 30
                  totalelapsedtime: 53.9077
                         nextpoint: [1x2 table]
                            xtrace: [30x2 table]
                    objectivetrace: [30x1 double]
                  constraintstrace: []
                     userdatatrace: {30x1 cell}
      objectiveevaluationtimetrace: [30x1 double]
                iterationtimetrace: [30x1 double]
                        errortrace: [30x1 double]
                  feasibilitytrace: [30x1 logical]
       feasibilityprobabilitytrace: [30x1 double]
               indexofminimumtrace: [30x1 double]
             objectiveminimumtrace: [30x1 double]
    estimatedobjectiveminimumtrace: [30x1 double]
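
you can then query the returned object for the chosen hyperparameters; a short sketch continuing from the run above (the exact values depend on the run):

% hyperparameters at the minimum observed and minimum estimated objective
results.XAtMinObjective
results.XAtMinEstimatedObjective

% corresponding objective values
[results.MinObjective results.MinEstimatedObjective]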

this example shows how to minimize the cross-validation loss in the ionosphere data using bayesian optimization of an svm classifier.

load the data.

load ionosphere

optimize the classification using the 'auto' parameters.

rng default % for reproducibility
mdl = fitcsvm(X,Y,'OptimizeHyperparameters','auto')
|=====================================================================================================|
| iter | eval   | objective   | objective   | bestsofar   | bestsofar   | boxconstraint|  kernelscale |
|      | result |             | runtime     | (observed)  | (estim.)    |              |              |
|=====================================================================================================|
|    1 | best   |     0.25926 |      17.693 |     0.25926 |     0.25926 |       64.836 |    0.0015729 |
|    2 | accept |     0.35897 |     0.12495 |     0.25926 |     0.26547 |     0.036335 |       5.5755 |
|    3 | best   |     0.13105 |      6.7098 |     0.13105 |     0.14588 |    0.0022147 |    0.0023957 |
|    4 | accept |     0.35897 |     0.25756 |     0.13105 |     0.13108 |       5.1259 |        98.62 |
|    5 | best   |     0.12251 |     0.30018 |     0.12251 |     0.12253 |    0.0010264 |     0.042908 |
|    6 | accept |      0.1396 |     0.30734 |     0.12251 |     0.12253 |     0.021383 |     0.037148 |
|    7 | accept |     0.12821 |     0.47473 |     0.12251 |     0.12472 |     0.001017 |     0.013853 |
|    8 | accept |     0.12536 |     0.20292 |     0.12251 |     0.12278 |    0.0010632 |     0.029785 |
|    9 | accept |      0.1339 |     0.24642 |     0.12251 |     0.12411 |    0.0010856 |     0.076868 |
|   10 | accept |     0.12821 |     0.21524 |     0.12251 |     0.12537 |     0.001008 |     0.031877 |
|   11 | accept |     0.12251 |     0.23305 |     0.12251 |     0.12502 |    0.0022473 |     0.027232 |
|   12 | accept |     0.12821 |     0.27149 |     0.12251 |     0.12491 |    0.0034295 |     0.023161 |
|   13 | accept |     0.12821 |     0.18529 |     0.12251 |     0.12567 |    0.0010116 |     0.029489 |
|   14 | best   |     0.11681 |     0.17675 |     0.11681 |     0.12349 |    0.0017917 |      0.03261 |
|   15 | accept |     0.12536 |     0.21292 |     0.11681 |      0.1239 |    0.0024766 |     0.035709 |
|   16 | accept |      0.1339 |     0.91022 |     0.11681 |     0.12346 |        999.6 |       5.3437 |
|   17 | accept |     0.13105 |       14.54 |     0.11681 |     0.12358 |       996.49 |      0.33637 |
|   18 | accept |      0.1396 |     0.19529 |     0.11681 |     0.12356 |       966.54 |       130.04 |
|   19 | accept |     0.35897 |     0.43483 |     0.11681 |     0.12267 |        992.2 |       895.91 |
|   20 | accept |     0.12821 |     0.19052 |     0.11681 |     0.12261 |       999.77 |       40.083 |
|=====================================================================================================|
| iter | eval   | objective   | objective   | bestsofar   | bestsofar   | boxconstraint|  kernelscale |
|      | result |             | runtime     | (observed)  | (estim.)    |              |              |
|=====================================================================================================|
|   21 | accept |     0.13105 |     0.26461 |     0.11681 |     0.12274 |       995.79 |       74.272 |
|   22 | accept |     0.12821 |     0.26234 |     0.11681 |     0.12353 |    0.0027724 |     0.047693 |
|   23 | accept |     0.13675 |      12.401 |     0.11681 |     0.12363 |       3.4174 |     0.038804 |
|   24 | accept |     0.12536 |     0.26678 |     0.11681 |     0.12364 |       994.97 |       17.307 |
|   25 | accept |     0.35897 |     0.16473 |     0.11681 |     0.12355 |    0.0010088 |       15.344 |
|   26 | accept |     0.13675 |      1.2304 |     0.11681 |     0.12361 |    0.0091145 |     0.010512 |
|   27 | accept |     0.35897 |     0.42997 |     0.11681 |     0.12351 |    0.0010009 |       2.5587 |
|   28 | accept |     0.16239 |     0.25135 |     0.11681 |     0.11719 |       1.7135 |       13.156 |
|   29 | accept |     0.12536 |     0.66233 |     0.11681 |     0.11723 |       264.54 |       25.024 |
|   30 | accept |      0.1339 |     0.17091 |     0.11681 |     0.11722 |       1.6366 |       3.1961 |

(figure: min objective vs. number of function evaluations, showing min observed objective and estimated min objective)

(figure: objective function model over boxconstraint and kernelscale, showing observed points, model mean, next point, and model minimum feasible)

__________________________________________________________
optimization completed.
maxobjectiveevaluations of 30 reached.
total function evaluations: 30
total elapsed time: 99.0638 seconds
total objective function evaluation time: 59.9873
best observed feasible point:
    boxconstraint    kernelscale
    _____________    ___________
      0.0017917        0.03261  
observed objective function value = 0.11681
estimated objective function value = 0.11722
function evaluation time = 0.17675
best estimated feasible point (according to models):
    boxconstraint    kernelscale
    _____________    ___________
      0.0017917        0.03261  
estimated objective function value = 0.11722
estimated function evaluation time = 0.21613
mdl = 
  classificationsvm
                         responsename: 'y'
                categoricalpredictors: []
                           classnames: {'b'  'g'}
                       scoretransform: 'none'
                      numobservations: 351
    hyperparameteroptimizationresults: [1x1 bayesianoptimization]
                                alpha: [100x1 double]
                                 bias: -4.7046
                     kernelparameters: [1x1 struct]
                       boxconstraints: [351x1 double]
                      convergenceinfo: [1x1 struct]
                      issupportvector: [351x1 logical]
                               solver: 'smo'
  properties, methods

the fit achieved about 12% loss for the default 5-fold cross validation.

examine the bayesianoptimization object that is returned in the hyperparameteroptimizationresults property of the returned model.

disp(mdl.HyperparameterOptimizationResults)
  bayesianoptimization with properties:
                      objectivefcn: @createobjfcn/inmemoryobjfcn
              variabledescriptions: [5x1 optimizablevariable]
                           options: [1x1 struct]
                      minobjective: 0.1168
                   xatminobjective: [1x2 table]
             minestimatedobjective: 0.1172
          xatminestimatedobjective: [1x2 table]
           numobjectiveevaluations: 30
                  totalelapsedtime: 99.0638
                         nextpoint: [1x2 table]
                            xtrace: [30x2 table]
                    objectivetrace: [30x1 double]
                  constraintstrace: []
                     userdatatrace: {30x1 cell}
      objectiveevaluationtimetrace: [30x1 double]
                iterationtimetrace: [30x1 double]
                        errortrace: [30x1 double]
                  feasibilitytrace: [30x1 logical]
       feasibilityprobabilitytrace: [30x1 double]
               indexofminimumtrace: [30x1 double]
             objectiveminimumtrace: [30x1 double]
    estimatedobjectiveminimumtrace: [30x1 double]
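
the stored object supports the same object functions as one returned by bayesopt directly; for example (a sketch, with results depending on the run):

% best hyperparameters according to the default bestPoint criterion
xBest = bestPoint(mdl.HyperparameterOptimizationResults)

% replot the minimum-objective trace from the stored results
plot(mdl.HyperparameterOptimizationResults,@plotMinObjective)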

version history

introduced in r2016b
