
Commit ed23f62

Updated version number and documentation. Bugfix for auto-detecting restart (avoid warning)

1 parent: 3d9674b

101 files changed: +22058 additions, -2145 deletions


README.rst

Lines changed: 2 additions & 0 deletions

@@ -57,6 +57,8 @@ Additionally, the following python packages should be installed (these will be i
 * SciPy 0.18 or higher (http://www.scipy.org/)
 * Pandas 0.17 or higher (http://pandas.pydata.org/)
 
+**Optional package:** Py-BOBYQA versions 1.2 and higher also support the `trustregion <https://github.com/lindonroberts/trust-region>`_ package for fast trust-region subproblem solutions. To install this, make sure you have a Fortran compiler (e.g. `gfortran <https://gcc.gnu.org/wiki/GFortran>`_) and NumPy installed, then run :code:`pip install trustregion`. You do not have to have trustregion installed for Py-BOBYQA to work, and it is not installed by default.
+
 Installation using pip
 ----------------------
 For easy installation, use `pip <http://www.pip-installer.org/>`_ as root:
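Since trustregion is optional, as the paragraph above states, code can check for it at runtime without failing when it is absent. A minimal standalone sketch (the printed messages are illustrative, not Py-BOBYQA output):

```python
import importlib.util

# Py-BOBYQA can use the compiled 'trustregion' package for fast
# trust-region subproblem solutions if it is importable, and falls
# back to its built-in solver otherwise.
spec = importlib.util.find_spec("trustregion")
if spec is not None:
    print("trustregion available: fast Fortran subproblem solver can be used")
else:
    print("trustregion not installed: Py-BOBYQA still works without it")
```

This mirrors the documented behaviour that trustregion is never required, only used when present.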

docs/build/doctrees/advanced.doctree: 54.8 KB (binary file not shown)
(filename not captured): 21 KB (binary file not shown)
(filename not captured): 30 KB (binary file not shown)
docs/build/doctrees/history.doctree: 12.4 KB (binary file not shown)
docs/build/doctrees/index.doctree: 17.9 KB (binary file not shown)
docs/build/doctrees/info.doctree: 21.4 KB (binary file not shown)
docs/build/doctrees/install.doctree: 17.8 KB (binary file not shown)
docs/build/doctrees/userguide.doctree: 85.9 KB (binary file not shown)

docs/build/html/.buildinfo

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: d43f685d9691e4e32916b1166fef01dd
+config: 2f9ae8a9ee55c554092f8148691bface
 tags: 645f666f9bcd5a90fca523b33c5a78b7

docs/build/html/_sources/history.rst.txt

Lines changed: 8 additions & 0 deletions

@@ -27,3 +27,11 @@ Version 1.1.1 (5 Apr 2019)
 --------------------------
 * Link code to Zenodo, to create DOI - no changes to the Py-BOBYQA algorithm.
 
+Version 1.2 (25 Feb 2020)
+-------------------------
+* Use deterministic initialisation by default (so it is no longer necessary to set a random seed for reproducibility of Py-BOBYQA results).
+* Full model Hessian stored rather than just upper triangular part - this improves the runtime of Hessian-based operations.
+* Faster trust-region and geometry subproblem solutions in Fortran using the `trustregion <https://github.com/lindonroberts/trust-region>`_ package.
+* Don't adjust starting point if it is close to the bounds (as long as it is feasible).
+* Option to stop default logging behavior and/or enable per-iteration printing.
+* Bugfix: correctly handle 1-sided bounds as inputs, avoid divide-by-zero warnings when auto-detecting restarts.

docs/build/html/_sources/index.rst.txt

Lines changed: 2 additions & 0 deletions

@@ -21,6 +21,8 @@ That is, Py-BOBYQA solves
    \min_{x\in\mathbb{R}^n} &\quad f(x)\\
    \text{s.t.} &\quad a \leq x \leq b
 
+The upper and lower bounds on the variables are non-relaxable (i.e. Py-BOBYQA will never ask to evaluate a point outside the bounds).
+
 Full details of the Py-BOBYQA algorithm are given in our papers:
 
 1. Coralia Cartis, Jan Fiala, Benjamin Marteau and Lindon Roberts, `Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers <https://arxiv.org/abs/1804.00154>`_, technical report, University of Oxford, (2018).

docs/build/html/_sources/install.rst.txt

Lines changed: 1 addition & 0 deletions

@@ -13,6 +13,7 @@ Additionally, the following python packages should be installed (these will be i
 * `SciPy 0.18 or higher <http://www.scipy.org/>`_
 * `Pandas 0.17 or higher <https://pandas.pydata.org/>`_
 
+**Optional package:** Py-BOBYQA versions 1.2 and higher also support the `trustregion <https://github.com/lindonroberts/trust-region>`_ package for fast trust-region subproblem solutions. To install this, make sure you have a Fortran compiler (e.g. `gfortran <https://gcc.gnu.org/wiki/GFortran>`_) and NumPy installed, then run :code:`pip install trustregion`. You do not have to have trustregion installed for Py-BOBYQA to work, and it is not installed by default.
 
 Installation using pip
 ----------------------

docs/build/html/_sources/userguide.rst.txt

Lines changed: 65 additions & 51 deletions

@@ -11,7 +11,7 @@ Py-BOBYQA is designed to solve the local optimization problem
    \min_{x\in\mathbb{R}^n} &\quad f(x) \\
    \text{s.t.} &\quad a \leq x \leq b
 
-where the bound constraints :math:`a \leq x \leq b` are optional. The objective function :math:`f(x)` is usually nonlinear and nonquadratic. If you know your objective is linear or quadratic, you should consider a solver designed for such functions (see `here <https://neos-guide.org/Optimization-Guide>`_ for details).
+where the bound constraints :math:`a \leq x \leq b` are optional. The upper and lower bounds on the variables are non-relaxable (i.e. Py-BOBYQA will never ask to evaluate a point outside the bounds). The objective function :math:`f(x)` is usually nonlinear and nonquadratic. If you know your objective is linear or quadratic, you should consider a solver designed for such functions (see `here <https://neos-guide.org/Optimization-Guide>`_ for details).
 
 Py-BOBYQA iteratively constructs an interpolation-based model for the objective, and determines a step using a trust-region framework.
 For an in-depth technical description of the algorithm see the paper [CFMR2018]_, and for the global optimization heuristic, see [CRO2018]_.
@@ -69,7 +69,8 @@ The :code:`solve` function has several optional arguments which the user may pro
                rhobeg=None, rhoend=1e-8, maxfun=None, nsamples=None,
                user_params=None, objfun_has_noise=False,
                seek_global_minimum=False,
-               scaling_within_bounds=False)
+               scaling_within_bounds=False,
+               do_logging=True, print_progress=False)
 
 These arguments are:
 
@@ -84,6 +85,8 @@ These arguments are:
 * :code:`objfun_has_noise` - a flag to indicate whether or not :code:`objfun` has stochastic noise; i.e. will calling :code:`objfun(x)` multiple times at the same value of :code:`x` give different results? This is used to set some sensible default parameters (including using multiple restarts), all of which can be overridden by the values provided in :code:`user_params`.
 * :code:`seek_global_minimum` - a flag to indicate whether to search for a global minimum, rather than a local minimum. This is used to set some sensible default parameters, all of which can be overridden by the values provided in :code:`user_params`. If :code:`True`, both upper and lower bounds must be set. Note that Py-BOBYQA only implements a heuristic method, so there are no guarantees it will find a global minimum. However, by using this flag, it is more likely to escape local minima if there are better values nearby. The method used is a multiple restart mechanism, where we repeatedly re-initialize Py-BOBYQA from the best point found so far, but where we use a larger trust region radius each time (note: this is different from the more common multi-start approach to global optimization).
 * :code:`scaling_within_bounds` - a flag to indicate whether the algorithm should internally shift and scale the entries of :code:`x` so that the bounds become :math:`0 \leq x \leq 1`. This is useful if you are setting :code:`bounds` and the bounds have different orders of magnitude. If :code:`scaling_within_bounds=True`, the values of :code:`rhobeg` and :code:`rhoend` apply to the *shifted* variables.
+* :code:`do_logging` - a flag to indicate whether logging output should be produced. This is not automatically visible unless you use the Python `logging <https://docs.python.org/3/library/logging.html>`_ module (see below for simple usage).
+* :code:`print_progress` - a flag to indicate whether to print a per-iteration progress log to the terminal.
 
 In general when using optimization software, it is good practice to scale your variables so that moving each by a given amount has approximately the same impact on the objective function.
 The :code:`scaling_within_bounds` flag is designed to provide an easy way to achieve this, if you have set the bounds :code:`lower` and :code:`upper`.
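The shift-and-scale described for :code:`scaling_within_bounds` corresponds to the affine map :math:`y = (x - a)/(b - a)`. A minimal NumPy sketch of that map (the variable names here are illustrative, not Py-BOBYQA internals):

```python
import numpy as np

# Bounds with very different orders of magnitude, the use case
# scaling_within_bounds is designed for
a = np.array([0.0, 1000.0])   # lower bounds
b = np.array([1.0, 5000.0])   # upper bounds
x = np.array([0.5, 3000.0])   # a feasible point

# Shifted-and-scaled variables lie in [0, 1]
y = (x - a) / (b - a)
print(y)  # [0.5 0.5]

# The inverse map recovers the original point
x_back = a + y * (b - a)
assert np.allclose(x_back, x)
```

With this scaling, moving any shifted variable by a fixed amount traverses the same fraction of its feasible range, which is the good-scaling practice the paragraph above recommends.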
@@ -112,9 +115,6 @@ This function has exactly one local minimum :math:`f(x_{min})=0` at :math:`x_{mi
     # Define the starting point
     x0 = np.array([-1.2, 1.0])
 
-    # Set random seed (for reproducibility)
-    np.random.seed(0)
-
     # Call Py-BOBYQA
     soln = pybobyqa.solve(rosenbrock, x0)
 
@@ -126,12 +126,12 @@ Note that Py-BOBYQA is a randomized algorithm: in its first phase, it builds an
 .. code-block:: none
 
     ****** Py-BOBYQA Results ******
-    Solution xmin = [ 1.  1.]
-    Objective value f(xmin) = 2.964036794e-19
-    Needed 213 objective evaluations (at 213 points)
-    Approximate gradient = [ -2.57280154e-08  1.26855793e-08]
-    Approximate Hessian = [[ 802.90904563 -400.46022134]
-     [-400.46022134  200.23335154]]
+    Solution xmin = [1. 1.]
+    Objective value f(xmin) = 1.013856052e-20
+    Needed 151 objective evaluations (at 151 points)
+    Approximate gradient = [ 2.35772499e-08 -1.07598803e-08]
+    Approximate Hessian = [[ 802.00799968 -400.04089119]
+     [-400.04089119  199.99228723]]
     Exit flag = 0
     Success: rho has reached rhoend
     ******************************
@@ -156,12 +156,12 @@ Py-BOBYQA correctly finds the solution to the constrained problem:
 .. code-block:: none
 
     ****** Py-BOBYQA Results ******
-    Solution xmin = [ 0.9   0.81]
+    Solution xmin = [0.9  0.81]
     Objective value f(xmin) = 0.01
-    Needed 134 objective evaluations (at 134 points)
-    Approximate gradient = [ -1.99999226e-01 -4.31078784e-07]
-    Approximate Hessian = [[ 649.6790222  -360.18361979]
-     [-360.18361979  200.00205196]]
+    Needed 146 objective evaluations (at 146 points)
+    Approximate gradient = [-2.00000006e-01 -4.74578563e-09]
+    Approximate Hessian = [[ 649.66398033 -361.03094781]
+     [-361.03094781  199.94223213]]
     Exit flag = 0
     Success: rho has reached rhoend
     ******************************
@@ -188,12 +188,12 @@ And we can now see each evaluation of :code:`objfun`:
 .. code-block:: none
 
     Function eval 1 at point 1 has f = 39.65 at x = [-1.2  0.85]
-    Initialising (random directions)
+    Initialising (coordinate directions)
     Function eval 2 at point 2 has f = 14.337296 at x = [-1.08  0.85]
     Function eval 3 at point 3 has f = 55.25 at x = [-1.2  0.73]
     ...
-    Function eval 133 at point 133 has f = 0.0100000000000165 at x = [ 0.9  0.81000001]
-    Function eval 134 at point 134 has f = 0.00999999999999997 at x = [ 0.9  0.81]
+    Function eval 145 at point 145 has f = 0.0100000013172792 at x = [0.89999999 0.80999999]
+    Function eval 146 at point 146 has f = 0.00999999999999993 at x = [0.9  0.81]
     Did a total of 1 run(s)
 
If we wanted to save this output to a file, we could replace the above call to :code:`logging.basicConfig()` with
@@ -203,6 +203,20 @@ If we wanted to save this output to a file, we could replace the above call to :
     logging.basicConfig(filename="myfile.log", level=logging.INFO,
                         format='%(message)s', filemode='w')
 
+If you have logging for some parts of your code and you want to deactivate all Py-BOBYQA logging, you can use the optional argument :code:`do_logging=False` in :code:`pybobyqa.solve()`.
+
+An alternative is to have Py-BOBYQA print progress information to the terminal at every iteration, by setting the optional argument :code:`print_progress=True` in :code:`pybobyqa.solve()`. If we do this for the above example, we get
+
+.. code-block:: none
+
+     Run  Iter     Obj       Grad     Delta      rho     Evals
+      1     1    1.43e+01  1.74e+02  1.20e-01  1.20e-01    5
+      1     2    5.57e+00  1.20e+02  3.66e-01  1.20e-01    6
+      1     3    5.57e+00  1.20e+02  6.00e-02  1.20e-02    6
+      ...
+      1   132    1.00e-02  2.00e-01  1.50e-08  1.00e-08   144
+      1   133    1.00e-02  2.00e-01  1.50e-08  1.00e-08   145
+
 Example: Noisy Objective Evaluation
 -----------------------------------
 As described in :doc:`info`, derivative-free algorithms such as Py-BOBYQA are particularly useful when :code:`objfun` has noise. Let's modify the previous example to include random noise in our objective evaluation, and compare it to a derivative-based solver:
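The :code:`logging.basicConfig(filename=...)` call shown in this hunk can be exercised on its own. A standalone sketch (the temp-file path and :code:`force=True` are additions here purely so the sketch is re-runnable; the docs' own snippet omits them):

```python
import logging
import os
import tempfile

# Write INFO-level messages to a file, overwriting it on each run
# (filemode='w'), mirroring the configuration in the user guide.
logfile = os.path.join(tempfile.gettempdir(), "myfile.log")
logging.basicConfig(filename=logfile, level=logging.INFO,
                    format='%(message)s', filemode='w',
                    force=True)  # force=True: reset any prior config (sketch only)

logging.info("Function eval 1 at point 1 has f = 39.65")

# Flush handlers so the message is on disk before reading it back
for handler in logging.getLogger().handlers:
    handler.flush()

with open(logfile) as f:
    print(f.read().strip())  # Function eval 1 at point 1 has f = 39.65
```

Because Py-BOBYQA emits its output through the standard :code:`logging` module, this one configuration call controls where all of its log messages go.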
@@ -230,7 +244,7 @@ As described in :doc:`info`, derivative-free algorithms such as Py-BOBYQA are pa
 
     print("Demonstrate noise in function evaluation:")
     for i in range(5):
-        print("objfun(x0) = %s" % str(rosenbrock_noisy(x0)))
+        print("objfun(x0) = %g" % rosenbrock_noisy(x0))
     print("")
 
     # Call Py-BOBYQA
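For reference, a noisy objective of the kind this example uses can be sketched as follows. The multiplicative 1% Gaussian noise model is an assumption (the hunk does not show the guide's exact :code:`rosenbrock_noisy` definition), and no call to :code:`pybobyqa.solve` is made here:

```python
import numpy as np

def rosenbrock(x):
    # Classic 2D Rosenbrock function, minimum f = 0 at x = [1, 1]
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_noisy(x):
    # Multiplicative Gaussian noise at the 1% level (an illustrative choice)
    return rosenbrock(x) * (1.0 + 1e-2 * np.random.normal())

x0 = np.array([-1.2, 1.0])
print("%g" % rosenbrock(x0))  # 24.2 (the noise-free value)
for _ in range(3):
    print("objfun(x0) = %g" % rosenbrock_noisy(x0))  # varies around 24.2
```

Repeated calls at the same :code:`x0` return slightly different values, which is exactly the situation :code:`objfun_has_noise=True` is designed for.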
@@ -257,28 +271,28 @@ The output of this is:
 .. code-block:: none
 
     Demonstrate noise in function evaluation:
-    objfun(x0) = 24.6269006677
-    objfun(x0) = 24.2968380444
-    objfun(x0) = 24.4368545922
-    objfun(x0) = 24.7422961542
-    objfun(x0) = 24.6519490336
+    objfun(x0) = 24.6269
+    objfun(x0) = 24.2968
+    objfun(x0) = 24.4369
+    objfun(x0) = 24.7423
+    objfun(x0) = 24.6519
 
     ****** Py-BOBYQA Results ******
-    Solution xmin = [-1.02866429  1.07341548]
-    Objective value f(xmin) = 4.033118937
-    Needed 36 objective evaluations (at 36 points)
-    Approximate gradient = [-6921247.2999239  -3051622.27188687]
-    Approximate Hessian = [[ 1.98604897e+15  5.75929121e+14]
-     [ 5.75929121e+14  7.89533101e+14]]
+    Solution xmin = [-1.04327395  1.09935385]
+    Objective value f(xmin) = 4.080030471
+    Needed 42 objective evaluations (at 42 points)
+    Approximate gradient = [-3786376.5065785   5876675.51763198]
+    Approximate Hessian = [[ 1.32881117e+14 -2.68241358e+14]
+     [-2.68241358e+14  6.09785319e+14]]
     Exit flag = 0
     Success: rho has reached rhoend
     ******************************
 
 
     ** SciPy results **
-    Solution xmin = [-1.2  1. ]
-    Objective value f(xmin) = 23.80943672
-    Needed 104 objective evaluations
+    Solution xmin = [-1.20013817  0.99992915]
+    Objective value f(xmin) = 23.86371763
+    Needed 80 objective evaluations
     Exit flag = 2
     Desired error not necessarily achieved due to precision loss.
@@ -295,13 +309,13 @@ This time, we find the true solution, and better estimates of the gradient and H
 .. code-block:: none
 
     ****** Py-BOBYQA Results ******
-    Solution xmin = [ 1.  1.]
-    Objective value f(xmin) = 3.418770987e-18
+    Solution xmin = [1. 1.]
+    Objective value f(xmin) = 1.237351799e-19
     Needed 300 objective evaluations (at 300 points)
-    Did a total of 4 runs
-    Approximate gradient = [ -1.36175005e-08  2.12249758e-09]
-    Approximate Hessian = [[ 805.93202374 -394.16671315]
-     [-394.16671315  192.99451721]]
+    Did a total of 5 runs
+    Approximate gradient = [-2.17072738e-07  9.62304351e-08]
+    Approximate Hessian = [[ 809.56521044 -400.33737779]
+     [-400.33737779  198.36487985]]
     Exit flag = 1
     Warning (max evals): Objective has been called MAXFUN times
     ******************************
@@ -362,12 +376,12 @@ The output of this is:
 First run - search for local minimum only
 
     ****** Py-BOBYQA Results ******
-    Solution xmin = [ 11.41277906  -0.89680525]
+    Solution xmin = [11.41277902 -0.89680525]
     Objective value f(xmin) = 48.98425368
-    Needed 203 objective evaluations (at 203 points)
-    Approximate gradient = [ -1.61348180e-06 -3.61662651e-07]
-    Approximate Hessian = [[ 132.10265455  -45.5426821 ]
-     [ -45.5426821   976.15808779]]
+    Needed 143 objective evaluations (at 143 points)
+    Approximate gradient = [-1.64941396e-07  9.69795781e-07]
+    Approximate Hessian = [[   7.74717421 -104.51102613]
+     [-104.51102613 1135.76500421]]
     Exit flag = 0
     Success: rho has reached rhoend
     ******************************
@@ -377,13 +391,13 @@ The output of this is:
 Second run - search for global minimum
 
     ****** Py-BOBYQA Results ******
-    Solution xmin = [ 5.  4.]
-    Objective value f(xmin) = 9.734692105e-19
+    Solution xmin = [5. 4.]
+    Objective value f(xmin) = 3.659891409e-17
     Needed 500 objective evaluations (at 500 points)
-    Did a total of 4 runs
-    Approximate gradient = [ 4.28964221e-08  4.58344260e-07]
-    Approximate Hessian = [[    4.06992486    61.15006935]
-     [   61.15006935  3728.06826545]]
+    Did a total of 5 runs
+    Approximate gradient = [ 8.70038835e-10 -4.64918043e-07]
+    Approximate Hessian = [[    4.28883646    64.16836253]
+     [   64.16836253  3722.93109385]]
     Exit flag = 1
     Warning (max evals): Objective has been called MAXFUN times
     ******************************
