
Commit 44385e2

Updated documentation to clarify global optimization heuristic
1 parent 42cacdd commit 44385e2

File tree

3 files changed: +3, -3 lines changed

docs/index.rst

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@ Full details of the Py-BOBYQA algorithm are given in our paper: C. Cartis, J. Fi
If you are interested in solving least-squares minimization problems, you may wish to try `DFO-LS <https://github.com/numericalalgorithmsgroup/dfols>`_, which has the same features as Py-BOBYQA (plus some more), and exploits the least-squares problem structure, so performs better on such problems.

- **New feature: global optimization (July 2018)!** Py-BOBYQA now has a heuristic for global optimization (see :doc:`userguide` for details). As this is a heuristic, there are no guarantees it will find a global minimum, but it is more likely to escape local minima if there are better values nearby.
+ **New feature: global optimization heuristic (July 2018)!** Py-BOBYQA now has a heuristic for global optimization (see :doc:`userguide` for details). As this is a heuristic, there are no guarantees it will find a global minimum, but it is more likely to escape local minima if there are better values nearby.

Py-BOBYQA is released under the GNU General Public License. Please `contact NAG <http://www.nag.com/content/worldwide-contact-information>`_ for alternative licensing.

docs/userguide.rst

Lines changed: 2 additions & 2 deletions
@@ -75,14 +75,14 @@ These arguments are:
* :code:`args` - a tuple of extra arguments passed to the objective function.

* :code:`bounds` - a tuple :code:`(lower, upper)` with the vectors :math:`a` and :math:`b` of lower and upper bounds on :math:`x` (default is :math:`a_i=-10^{20}` and :math:`b_i=10^{20}`). To set bounds for either :code:`lower` or :code:`upper`, but not both, pass a tuple :code:`(lower, None)` or :code:`(None, upper)`.

- * :code:`npt` - the number of interpolation points to use (default is :code:`2*len(x0)+1`). Py-BOBYQA requires :code:`n+1 <= npt <= (n+1)*(n+2)/2` for a problem with :code:`len(x0)=n`. Larger values are particularly useful for noisy problems.
+ * :code:`npt` - the number of interpolation points to use (default is :math:`2n+1` for a problem with :code:`len(x0)=n` if :code:`objfun_has_noise=False`, otherwise it is set to :math:`(n+1)(n+2)/2`). Py-BOBYQA requires :math:`n+1 \leq npt \leq (n+1)(n+2)/2`. Larger values are particularly useful for noisy problems.

* :code:`rhobeg` - the initial value of the trust region radius (default is 0.1 if :code:`scaling_within_bounds=True`, otherwise :math:`0.1\max(\|x_0\|_{\infty}, 1)`).

* :code:`rhoend` - minimum allowed value of trust region radius, which determines when a successful termination occurs (default is :math:`10^{-8}`).

* :code:`maxfun` - the maximum number of objective evaluations the algorithm may request (default is :math:`\min(100(n+1),1000)`).

* :code:`nsamples` - a Python function :code:`nsamples(delta, rho, iter, nrestarts)` which returns the number of times to evaluate :code:`objfun` at a given point. This is only applicable for objectives with stochastic noise, when averaging multiple evaluations at the same point produces a more accurate value. The input parameters are the trust region radius (:code:`delta`), the lower bound on the trust region radius (:code:`rho`), how many iterations the algorithm has been running for (:code:`iter`), and how many restarts have been performed (:code:`nrestarts`). Default is no averaging (i.e. :code:`nsamples(delta, rho, iter, nrestarts)=1`).

* :code:`user_params` - a Python dictionary :code:`{'param1': val1, 'param2':val2, ...}` of optional parameters. A full list of available options is given in the next section :doc:`advanced`.

* :code:`objfun_has_noise` - a flag to indicate whether or not :code:`objfun` has stochastic noise; i.e. will calling :code:`objfun(x)` multiple times at the same value of :code:`x` give different results? This is used to set some sensible default parameters (including using multiple restarts), all of which can be overridden by the values provided in :code:`user_params`.

- * :code:`seek_global_minimum` - a flag to indicate whether to search for a global minimum, rather than a local minimum. This is used to set some sensible default parameters (including using multiple restarts), all of which can be overridden by the values provided in :code:`user_params`. If :code:`True`, both upper and lower bounds must be set. Note that Py-BOBYQA only implements a heuristic method, so there are no guarantees it will find a global minimum. However, by using this flag, it is more likely to escape local minima if there are better values nearby.
+ * :code:`seek_global_minimum` - a flag to indicate whether to search for a global minimum, rather than a local minimum. This is used to set some sensible default parameters, all of which can be overridden by the values provided in :code:`user_params`. If :code:`True`, both upper and lower bounds must be set. Note that Py-BOBYQA only implements a heuristic method, so there are no guarantees it will find a global minimum. However, by using this flag, it is more likely to escape local minima if there are better values nearby. The method used is a multiple-restart mechanism, where we repeatedly re-initialize Py-BOBYQA from the best point found so far, but with a larger trust region radius each time (note: this is different from the more common multi-start approach to global optimization).

* :code:`scaling_within_bounds` - a flag to indicate whether the algorithm should internally shift and scale the entries of :code:`x` so that the bounds become :math:`0 \leq x \leq 1`. This is useful if you are setting :code:`bounds` and the bounds have different orders of magnitude. If :code:`scaling_within_bounds=True`, the values of :code:`rhobeg` and :code:`rhoend` apply to the *shifted* variables.

In general when using optimization software, it is good practice to scale your variables so that moving each by a given amount has approximately the same impact on the objective function.
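
For context, the arguments listed in this diff correspond to keyword parameters of :code:`pybobyqa.solve`. Below is a minimal usage sketch (not part of this commit): the keyword names (:code:`bounds`, :code:`npt`, :code:`maxfun`, :code:`seek_global_minimum`) come from the list above, while the objective function, bound values and evaluation budget are illustrative assumptions only.

    # Minimal sketch of calling pybobyqa.solve with the arguments described above.
    # The objective and the numerical values are illustrative assumptions.
    import numpy as np
    import pybobyqa

    def objfun(x):
        # A simple smooth test objective (Rosenbrock), chosen only for illustration
        return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

    x0 = np.array([-1.2, 1.0])       # starting point
    lower = np.array([-5.0, -5.0])   # finite lower bounds
    upper = np.array([5.0, 5.0])     # finite upper bounds (both required when seek_global_minimum=True)

    soln = pybobyqa.solve(objfun, x0,
                          bounds=(lower, upper),
                          npt=2 * len(x0) + 1,       # default for noise-free objectives
                          maxfun=500,                # evaluation budget
                          seek_global_minimum=True)  # heuristic restart mechanism described above
    print(soln)  # summary of solution, objective value, evaluation count and exit flag
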

manual.pdf

631 Bytes
Binary file not shown.
