Global Optimization Toolbox Solver Characteristics

Solver Choices

This section describes Global Optimization Toolbox solver characteristics and includes recommendations for obtaining results more effectively.

To achieve better or faster solutions, first try tuning the solver by setting appropriate options or bounds. If the results are unsatisfactory, try other solvers.
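
As an example, here is a minimal sketch of tuning patternsearch through options and bounds before trying a different solver. The two-variable objective, the bounds, and the option values are illustrative, not taken from this page.

% Illustrative objective; replace with your own function.
fun = @(x) 100*x(1)^2*(x(1) - 1)^2 - x(1) + x(2)^2;

% Tune the solver by adjusting options ...
opts = optimoptions('patternsearch', ...
    'MeshTolerance',1e-7, ...           % tighter stopping tolerance
    'MaxFunctionEvaluations',5000);     % allow a longer search

% ... and by supplying finite bounds, which often help.
lb = [-5 -5];
ub = [ 5  5];
x0 = [ 3  3];
[x,fval] = patternsearch(fun,x0,[],[],[],[],lb,ub,[],opts)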

Desired Solution | Smooth Objective and Constraints | Nonsmooth Objective or Constraints
Explanation of “Desired Solution” | Choosing Between Solvers for Smooth Problems | Choosing Between Solvers for Nonsmooth Problems
Single local solution | Optimization Toolbox™ functions; see Optimization Decision Table | fminbnd, patternsearch, fminsearch, ga, particleswarm, simulannealbnd, surrogateopt
Multiple local solutions | GlobalSearch, MultiStart | patternsearch, ga, particleswarm, simulannealbnd, or surrogateopt started from multiple initial points x0 or from multiple initial populations
Single global solution | GlobalSearch, MultiStart, patternsearch, particleswarm, ga, simulannealbnd, surrogateopt | patternsearch, ga, particleswarm, simulannealbnd, surrogateopt
Single local solution using parallel processing | MultiStart, Optimization Toolbox functions | patternsearch, ga, particleswarm, surrogateopt
Multiple local solutions using parallel processing | MultiStart | patternsearch, ga, or particleswarm started from multiple initial points x0 or from multiple initial populations
Single global solution using parallel processing | MultiStart | patternsearch, ga, particleswarm, surrogateopt
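
Several of the nonsmooth entries in this table call for starting patternsearch, ga, or particleswarm from multiple initial points. A minimal sketch of that pattern with patternsearch follows; the objective, bounds, and start points are illustrative, not from this page.

% Run patternsearch from several start points and collect the solutions found.
fun = @(x) 100*x(1)^2*(x(1) - 1)^2 - x(1) + x(2)^2;
lb = [-5 -5];
ub = [ 5  5];
startPoints = [-2 -2; 0 0; 0.5 0.5; 2 2];   % multiple initial points x0
xSol = zeros(size(startPoints));
fSol = zeros(size(startPoints,1),1);
for k = 1:size(startPoints,1)
    [xSol(k,:),fSol(k)] = patternsearch(fun,startPoints(k,:),[],[],[],[],lb,ub);
end
[xSol fSol]   % inspect the local minima and their objective values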

Explanation of “Desired Solution”

To understand the meaning of the terms in “Desired Solution,” consider the example

f(x) = 100x^2(x − 1)^2 − x,

which has local minima x1 near 0 and x2 near 1:

[Figure: plot of f(x) showing the local minima near 0 and 1]

The minima are located at:

fun = @(x)(100*x^2*(x - 1)^2 - x);
x1 = fminbnd(fun,-0.1,0.1)
x1 =
    0.0051
x2 = fminbnd(fun,0.9,1.1)
x2 =
    1.0049
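
To reproduce a plot like the figure above, you can use fplot with a vectorized copy of the objective. The plotting interval here is a guess that contains both minima.

funv = @(x) 100*x.^2.*(x - 1).^2 - x;   % vectorized form for plotting
fplot(funv,[-0.25 1.25])
xlabel('x')
ylabel('f(x)')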

Description of the Terms

Term | Meaning
Single local solution | Find one local solution, a point x where the objective function f(x) is a local minimum. In the example, both x1 and x2 are local solutions.
Multiple local solutions | Find a set of local solutions. In the example, the complete set of local solutions is {x1, x2}.
Single global solution | Find the point x where the objective function f(x) is a global minimum. In the example, the global solution is x2.
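
A quick check confirms which of the two local minima is the global one; the objective values shown in the comments are approximate.

% Evaluate the objective at the two local minima found above.
fun = @(x)(100*x^2*(x - 1)^2 - x);
f1 = fun(0.0051)   % approximately -0.0025
f2 = fun(1.0049)   % approximately -1.0025, the smaller value, so x2 is the global minimum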

Choosing Between Solvers for Smooth Problems

Single Global Solution

  1. Try GlobalSearch first. It is most focused on finding a global solution, and has an efficient local solver, fmincon. (A minimal usage sketch follows this list.)

  2. Try MultiStart next. It has efficient local solvers, and can search a wide variety of start points.

  3. Try patternsearch next. It is less efficient, since it does not use gradients. However, patternsearch is robust and is more efficient than the remaining local solvers. To search for a global solution, start patternsearch from a variety of start points.

  4. Try surrogateopt next. surrogateopt attempts to find a global solution using the fewest objective function evaluations. surrogateopt has more overhead per function evaluation than most other solvers. surrogateopt requires finite bounds, and accepts integer constraints, linear constraints, and nonlinear inequality constraints.

  5. Try particleswarm next, if your problem is unconstrained or has only bound constraints. Usually, particleswarm is more efficient than the remaining solvers, and can be more efficient than patternsearch.

  6. Try ga next. It can handle all types of constraints, and is usually more efficient than simulannealbnd.

  7. Try simulannealbnd last. It can handle problems with no constraints or bound constraints. simulannealbnd is usually the least efficient solver. However, given a slow enough cooling schedule, it can find a global solution.
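
For step 1, here is a minimal GlobalSearch sketch. The two-variable objective, start point, and bounds are illustrative.

% Package the problem for the fmincon local solver, then run GlobalSearch.
fun = @(x) 100*x(1)^2*(x(1) - 1)^2 - x(1) + x(2)^2;
problem = createOptimProblem('fmincon', ...
    'objective',fun, ...
    'x0',[3 3], ...
    'lb',[-5 -5],'ub',[5 5]);
gs = GlobalSearch;
[xg,fvalg] = run(gs,problem)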

Multiple Local Solutions

GlobalSearch and MultiStart both provide multiple local solutions; a minimal MultiStart sketch follows the list below. GlobalSearch and MultiStart differ in the following characteristics:

  • MultiStart can find more local minima. This is because GlobalSearch rejects many generated start points (initial points for local solutions). Essentially, GlobalSearch accepts a start point only when it determines that the point has a good chance of obtaining a global minimum. In contrast, MultiStart passes all generated start points to a local solver.

  • MultiStart offers a choice of local solver: fmincon, fminunc, lsqcurvefit, or lsqnonlin. The GlobalSearch solver uses only fmincon as its local solver.

  • GlobalSearch uses a scatter-search algorithm for generating start points. In contrast, MultiStart generates points uniformly at random within bounds, or allows you to provide your own points.

  • MultiStart can run in parallel.
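
Here is a minimal MultiStart sketch that returns multiple local solutions. The objective and bounds are illustrative, and the number of start points (50) is arbitrary.

% Run fmincon from 50 start points and collect the distinct local solutions.
fun = @(x) 100*x(1)^2*(x(1) - 1)^2 - x(1) + x(2)^2;
problem = createOptimProblem('fmincon','objective',fun, ...
    'x0',[3 3],'lb',[-5 -5],'ub',[5 5]);
ms = MultiStart;
[x,fval,exitflag,output,solutions] = run(ms,problem,50);
objectiveValues = [solutions.Fval]   % one entry per distinct local solution found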

Choosing Between Solvers for Nonsmooth Problems

Choose the applicable solver with the lowest number. For problems with integer constraints, use ga or surrogateopt.

  1. Use fminbnd first on one-dimensional bounded problems only. fminbnd provably converges quickly in one dimension.

  2. Use patternsearch on any other type of problem. patternsearch provably converges, and handles all types of constraints. (A minimal sketch follows this list.)

  3. Try surrogateopt for problems that have time-consuming objective functions. surrogateopt searches for a global solution. surrogateopt requires finite bounds, and accepts integer constraints, linear constraints, and nonlinear inequality constraints.

  4. Try fminsearch next for low-dimensional unbounded problems. fminsearch is not as general as patternsearch, and can fail to converge. For low-dimensional problems, fminsearch is simple to use, since it has few tuning options.

  5. Try particleswarm next on unbounded or bound-constrained problems. particleswarm has little supporting theory, but is often an efficient algorithm.

  6. Try ga next. ga has little supporting theory and is often less efficient than patternsearch or particleswarm. ga handles all types of constraints. ga and surrogateopt are the only Global Optimization Toolbox solvers that accept integer constraints.

  7. Try simulannealbnd last for unbounded problems, or for problems with bounds. simulannealbnd provably converges only for a logarithmic cooling schedule, which is extremely slow. simulannealbnd handles only bound constraints, and is often less efficient than ga.
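
As referenced in step 2, here is a minimal sketch for a nonsmooth problem with patternsearch, plus the corresponding surrogateopt call from step 3. The objective and bounds are illustrative.

% A nonsmooth objective (nondifferentiable at its minimizer).
fun = @(x) abs(x(1) - 1) + abs(x(2) + 2);
lb = [-10 -10];
ub = [ 10  10];

% patternsearch needs a start point and handles all constraint types.
x0 = [0 0];
[xp,fp] = patternsearch(fun,x0,[],[],[],[],lb,ub)

% surrogateopt needs only finite bounds (no start point) and searches for a global solution.
[xs,fs] = surrogateopt(fun,lb,ub)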

Solver Characteristics

Solver | Convergence | Characteristics
GlobalSearch | Fast convergence to local optima for smooth problems | Deterministic iterates; gradient-based; automatic stochastic start points; removes many start points heuristically
MultiStart | Fast convergence to local optima for smooth problems | Deterministic iterates; can run in parallel; gradient-based; stochastic or deterministic start points, or a combination of both; automatic stochastic start points; runs all start points; choice of local solver: fmincon, fminunc, lsqcurvefit, or lsqnonlin
patternsearch | Proven convergence to a local optimum; slower than gradient-based solvers | Deterministic iterates; can run in parallel; no gradients; user-supplied start point
surrogateopt | Proven convergence to a global optimum for bounded problems; slower than gradient-based solvers; generally stops by reaching a function evaluation limit or other limit | Stochastic iterates; can run in parallel; best used for time-consuming objective functions; requires bound constraints; accepts linear constraints and nonlinear inequality constraints; allows integer constraints; no gradients; automatic start points or user-supplied points, or a combination of both
particleswarm | No convergence proof | Stochastic iterates; can run in parallel; population-based; no gradients; automatic start population or user-supplied population, or a combination of both; only bound constraints
ga | No convergence proof | Stochastic iterates; can run in parallel; population-based; no gradients; allows integer constraints; automatic start population or user-supplied population, or a combination of both
simulannealbnd | Proven to converge to a global optimum for bounded problems with a very slow cooling schedule | Stochastic iterates; no gradients; user-supplied start point; only bound constraints

Explanation of some characteristics:

  • Convergence — Solvers can fail to converge to any solution when started far from a local minimum. When started near a local minimum, gradient-based solvers converge to a local minimum quickly for smooth problems. patternsearch provably converges for a wide range of problems, but the convergence is slower than that of gradient-based solvers. Both ga and simulannealbnd can fail to converge in a reasonable amount of time for some problems, although they are often effective.

  • Iterates — Solvers iterate to find solutions. The steps in the iteration are iterates. Some solvers have deterministic iterates. Others use random numbers and have stochastic iterates.

  • Gradients — Some solvers use estimated or user-supplied derivatives in calculating the iterates. Other solvers do not use or estimate derivatives, but use only objective and constraint function values.

  • Start points — Most solvers require you to provide a starting point for the optimization in order to obtain the dimension of the decision variables. ga and surrogateopt do not require any starting points, because they take the dimension of the decision variables as an input or infer dimensions from bounds. These solvers generate a start point or population automatically, or they accept a point or points that you supply, as the sketch below illustrates.
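
The following sketch contrasts how the solvers receive start-point information; the objective and bounds are illustrative.

% Some solvers require an initial point; ga and surrogateopt do not.
fun = @(x) x(1)^2 + x(2)^2 + 3*abs(sin(5*x(1)));
lb = [-4 -4];
ub = [ 4  4];

% patternsearch (like simulannealbnd) requires a user-supplied start point.
x0 = [1 2];
xp = patternsearch(fun,x0,[],[],[],[],lb,ub);

% ga takes the number of variables instead of a start point.
nvars = 2;
xga = ga(fun,nvars,[],[],[],[],lb,ub);

% surrogateopt infers the dimension from the required finite bounds.
xs = surrogateopt(fun,lb,ub);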

Compare the characteristics of Global Optimization Toolbox solvers to Optimization Toolbox solvers.

Solver | Convergence | Characteristics
fmincon, fminunc, fseminf, lsqcurvefit, lsqnonlin | Proven quadratic convergence to local optima for smooth problems | Deterministic iterates; gradient-based; user-supplied start point
fminsearch | No convergence proof (counterexamples exist) | Deterministic iterates; no gradients; user-supplied start point; no constraints
fminbnd | Proven convergence to local optima for smooth problems, slower than quadratic | Deterministic iterates; no gradients; user-supplied start interval; only one-dimensional problems

All these Optimization Toolbox solvers:

  • have deterministic iterates

  • require a start point or interval

  • search just one basin of attraction (illustrated in the sketch below)
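
For example, with the one-dimensional objective used earlier on this page, an Optimization Toolbox solver such as fminunc finds only the minimum whose basin contains the start point. This is a sketch; the start points are illustrative, and the comments state the expected outcome.

% Each run converges only to the minimum in the basin containing the start point.
fun = @(x) 100*x^2*(x - 1)^2 - x;
xa = fminunc(fun,-0.1)   % expected to converge to the local minimum near 0
xb = fminunc(fun,2)      % expected to converge to the global minimum near 1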

Why Are Some Solvers Objects?

GlobalSearch and MultiStart are objects. What does this mean for you?

  • You create a GlobalSearch or MultiStart object before running your problem.

  • You can reuse the object for running multiple problems, as the sketch after this list shows.

  • GlobalSearch and MultiStart objects are containers for algorithms and global options. You use these objects to run a local solver multiple times. The local solver has its own options.
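
Here is a minimal sketch of reusing one MultiStart object for two different problems. The objectives, start points, and bounds are illustrative.

% Create the object once, then run it on several problems.
ms = MultiStart('Display','final');

p1 = createOptimProblem('fmincon', ...
    'objective',@(x) peaks(x(1),x(2)), ...
    'x0',[0 0],'lb',[-3 -3],'ub',[3 3]);
p2 = createOptimProblem('fmincon', ...
    'objective',@(x) 100*x^2*(x - 1)^2 - x, ...
    'x0',0.5,'lb',-2,'ub',2);

[xa,fa] = run(ms,p1,20);   % same object, first problem
[xb,fb] = run(ms,p2,20);   % same object, second problem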

For more information, see the documentation.
