Using Py-BOBYQA

This section describes the main interface to Py-BOBYQA and how to use it.

Nonlinear Minimization

Py-BOBYQA is designed to solve the local optimization problem

\[\begin{split}\min_{x\in\mathbb{R}^n} &\quad f(x) \\ \text{s.t.} &\quad a \leq x \leq b\\ &\quad x \in C := C_1 \cap \cdots \cap C_n, \quad \text{all $C_i$ convex}\end{split}\]

where the bound constraints \(a \leq x \leq b\) and general convex constraints \(x \in C\) are optional. The upper and lower bounds on the variables are non-relaxable (i.e. Py-BOBYQA will never ask to evaluate a point outside the bounds). The general convex constraints are also non-relaxable, but they may be slightly violated at some points due to rounding errors. The objective function \(f(x)\) is usually nonlinear and nonquadratic. If you know your objective is linear or quadratic, you should consider a solver designed for such functions (see here for details).

Py-BOBYQA iteratively constructs an interpolation-based model for the objective, and determines a step using a trust-region framework. For an in-depth technical description of the algorithm see the paper [CFMR2018], and for the global optimization heuristic, see [CRO2022]. For details about how Py-BOBYQA handles general convex constraints, see [R2024].

How to use Py-BOBYQA

The main interface to Py-BOBYQA is via the function solve

soln = pybobyqa.solve(objfun, x0)

The input objfun is a Python function which takes an input \(x\in\mathbb{R}^n\) and returns the objective value \(f(x)\in\mathbb{R}\). The input to objfun must be a one-dimensional NumPy array (i.e. with x.shape == (n,)), and the output must be a single Float.

The input x0 is the starting point for the solver, and (where possible) should be set to be the best available estimate of the true solution \(x_{min}\in\mathbb{R}^n\). It should be specified as a one-dimensional NumPy array (i.e. with x0.shape == (n,)). As Py-BOBYQA is a local solver, providing different values for x0 may cause it to return different solutions, with possibly different objective values.
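
For example, a valid objective and starting point for a three-variable problem might look like the following sketch (the simple quadratic objective here is purely illustrative):

import numpy as np
import pybobyqa

# Objective: takes a 1-D array of shape (n,) and returns a single float
def objfun(x):
    return float(np.sum(x ** 2))

x0 = np.array([1.0, -0.5, 2.0])  # shape (3,)
soln = pybobyqa.solve(objfun, x0)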

The output of pybobyqa.solve is an object containing:

  • soln.x - an estimate of the solution, \(x_{min}\in\mathbb{R}^n\), a one-dimensional NumPy array.

  • soln.f - the objective value at the calculated solution, \(f(x_{min})\), a Float.

  • soln.gradient - an estimate of the gradient vector of first derivatives of the objective, \(g_i \approx \partial f(x_{min})/\partial x_i\), a NumPy array of length \(n\).

  • soln.hessian - an estimate of the Hessian matrix of second derivatives of the objective, \(H_{i,j} \approx \partial^2 f(x_{min})/\partial x_i \partial x_j\), a NumPy array of size \(n\times n\).

  • soln.nf - the number of evaluations of objfun that the algorithm needed, an Integer.

  • soln.nx - the number of points \(x\) at which objfun was evaluated, an Integer. This may be different to soln.nf if sample averaging is used.

  • soln.nruns - the number of runs performed by Py-BOBYQA (more than 1 if using multiple restarts), an Integer.

  • soln.flag - an exit flag, which can take one of several values (listed below), an Integer.

  • soln.msg - a description of why the algorithm finished, a String.

  • soln.diagnostic_info - a table of diagnostic information showing the progress of the solver, a Pandas DataFrame.

The possible values of soln.flag are defined by the following variables:

  • soln.EXIT_SUCCESS - Py-BOBYQA terminated successfully (the objective value or trust region radius is sufficiently small).

  • soln.EXIT_MAXFUN_WARNING - maximum allowed objective evaluations reached. This is the most likely return value when using multiple restarts.

  • soln.EXIT_SLOW_WARNING - maximum number of slow iterations reached.

  • soln.EXIT_FALSE_SUCCESS_WARNING - Py-BOBYQA reached the maximum number of restarts which decreased the objective, but to a worse value than was found in a previous run.

  • soln.EXIT_INPUT_ERROR - error in the inputs.

  • soln.EXIT_TR_INCREASE_ERROR - error occurred when solving the trust region subproblem.

  • soln.EXIT_LINALG_ERROR - linear algebra error, e.g. the interpolation points produced a singular linear system.

These variables are defined in the soln object, so can be accessed with, for example

if soln.flag == soln.EXIT_SUCCESS:
    print("Success!")

Optional Arguments

The solve function has several optional arguments which the user may provide:

pybobyqa.solve(objfun, x0, args=(), bounds=None, projections=None,
               npt=None, rhobeg=None, rhoend=1e-8, maxfun=None,
               nsamples=None, user_params=None, objfun_has_noise=False,
               seek_global_minimum=False,
               scaling_within_bounds=False,
               do_logging=True, print_progress=False)

These arguments are:

  • args - a tuple of extra arguments passed to the objective function.

  • bounds - a tuple (lower, upper) with the vectors \(a\) and \(b\) of lower and upper bounds on \(x\) (default is \(a_i=-10^{20}\) and \(b_i=10^{20}\)). To set bounds for either lower or upper, but not both, pass a tuple (lower, None) or (None, upper).

  • projections - a list of functions defining the Euclidean projections for each general convex constraint \(C_i\). Each element of the list projections is a function that takes an input vector \(x\) and returns the closest point to \(x\) that is in \(C_i\). An example of using this is given below.

  • npt - the number of interpolation points to use (default is \(2n+1\) for a problem with len(x0)=n if objfun_has_noise=False, otherwise it is set to \((n+1)(n+2)/2\)). Py-BOBYQA requires \(n+1 \leq npt \leq (n+1)(n+2)/2\). Larger values are particularly useful for noisy problems.

  • rhobeg - the initial value of the trust region radius (default is 0.1 if scaling_within_bounds=True, otherwise \(0.1\max(\|x_0\|_{\infty}, 1)\)).

  • rhoend - minimum allowed value of trust region radius, which determines when a successful termination occurs (default is \(10^{-8}\)).

  • maxfun - the maximum number of objective evaluations the algorithm may request (default is \(\min(100(n+1),1000)\)).

  • nsamples - a Python function nsamples(delta, rho, iter, nrestarts) which returns the number of times to evaluate objfun at a given point. This is only applicable for objectives with stochastic noise, when averaging multiple evaluations at the same point produces a more accurate value. The input parameters are the trust region radius (delta), the lower bound on the trust region radius (rho), how many iterations the algorithm has been running for (iter), and how many restarts have been performed (nrestarts). Default is no averaging (i.e. nsamples(delta, rho, iter, nrestarts)=1). A short sketch of such a function is given after this list.

  • user_params - a Python dictionary {'param1': val1, 'param2':val2, ...} of optional parameters. A full list of available options is given in the next section Advanced Usage.

  • objfun_has_noise - a flag to indicate whether or not objfun has stochastic noise; i.e. will calling objfun(x) multiple times at the same value of x give different results? This is used to set some sensible default parameters (including using multiple restarts), all of which can be overridden by the values provided in user_params.

  • seek_global_minimum - a flag to indicate whether to search for a global minimum, rather than a local minimum. This is used to set some sensible default parameters, all of which can be overridden by the values provided in user_params. If True, both upper and lower bounds must be set. Note that Py-BOBYQA only implements a heuristic method, so there are no guarantees it will find a global minimum. However, by using this flag, it is more likely to escape local minima if there are better values nearby. The method used is a multiple restart mechanism, where we repeatedly re-initialize Py-BOBYQA from the best point found so far, but where we use a larger trust region radius each time (note: this is different to the more common multi-start approach to global optimization).

  • scaling_within_bounds - a flag to indicate whether the algorithm should internally shift and scale the entries of x so that the bounds become \(0 \leq x \leq 1\). This is useful if you are setting bounds and they have very different orders of magnitude. If scaling_within_bounds=True, the values of rhobeg and rhoend apply to the shifted variables.

  • do_logging - a flag to indicate whether logging output should be produced. This is not automatically visible unless you use the Python logging module (see below for simple usage).

  • print_progress - a flag to indicate whether to print a per-iteration progress log to terminal.
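
As mentioned in the description of nsamples above, a sample-averaging rule is simply a Python function with the signature nsamples(delta, rho, iter, nrestarts) returning an integer. A minimal sketch (the particular rule used here is illustrative only) is:

# Illustrative sampling rule: average more evaluations as the trust region shrinks
def nsamples(delta, rho, iter, nrestarts):
    if delta >= 0.1:
        return 1   # far from the solution, one (noisy) evaluation per point
    else:
        return 5   # close to the solution, average 5 evaluations per point

# soln = pybobyqa.solve(objfun, x0, objfun_has_noise=True, nsamples=nsamples)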

In general when using optimization software, it is good practice to scale your variables so that moving each by a given amount has approximately the same impact on the objective function. The scaling_within_bounds flag is designed to provide an easy way to achieve this, if you have set the bounds lower and upper.
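
For example, if the first variable lies in \([0,1]\) and the second in \([0,10^6]\), we could enable this scaling as follows (a minimal sketch; objfun and x0 are assumed to be defined as before):

lower = np.array([0.0, 0.0])
upper = np.array([1.0, 1e6])
# rhobeg and rhoend now apply to the internally shifted and scaled variables in [0,1]
soln = pybobyqa.solve(objfun, x0, bounds=(lower, upper), scaling_within_bounds=True)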

A Simple Example

Suppose we wish to minimize the Rosenbrock test function:

\[\min_{(x_1,x_2)\in\mathbb{R}^2} \quad 100(x_2-x_1^2)^2 + (1-x_1)^2\]

This function has exactly one local minimum \(f(x_{min})=0\) at \(x_{min}=(1,1)\). A commonly-used starting point for testing purposes is \(x_0=(-1.2,1)\). The following script shows how to solve this problem using Py-BOBYQA:

# Py-BOBYQA example: minimize the Rosenbrock function
from __future__ import print_function
import numpy as np
import pybobyqa

# Define the objective function
def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# Define the starting point
x0 = np.array([-1.2, 1.0])

# Set random seed (for reproducibility)
np.random.seed(0)

# Call Py-BOBYQA
soln = pybobyqa.solve(rosenbrock, x0)

# Display output
print(soln)

Note that Py-BOBYQA is a randomized algorithm: in its first phase, it builds an internal approximation to the objective function by sampling it along random directions. In the code above, we set NumPy’s random seed for reproducibility over multiple runs, but this is not required. The output of this script, showing that Py-BOBYQA finds the correct solution, is

****** Py-BOBYQA Results ******
Solution xmin = [1. 1.]
Objective value f(xmin) = 1.013856052e-20
Needed 151 objective evaluations (at 151 points)
Approximate gradient = [ 2.35772499e-08 -1.07598803e-08]
Approximate Hessian = [[ 802.00799968 -400.04089119]
 [-400.04089119  199.99228723]]
Exit flag = 0
Success: rho has reached rhoend
******************************
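
As a sanity check, the exact gradient of the Rosenbrock function at \(x_{min}=(1,1)\) is zero and the exact Hessian is

\[\nabla^2 f(x_{min}) = \begin{pmatrix} 802 & -400 \\ -400 & 200 \end{pmatrix},\]

so the approximate derivatives reported above are accurate.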

This and all following problems can be found in the examples directory on the Py-BOBYQA GitHub page.

Adding Bounds and More Output

We can extend the above script to add bound constraints. To do this, we add the lines

# Define bound constraints (lower <= x <= upper)
lower = np.array([-10.0, -10.0])
upper = np.array([0.9, 0.85])

# Call Py-BOBYQA (with bounds)
soln = pybobyqa.solve(rosenbrock, x0, bounds=(lower,upper))

Py-BOBYQA correctly finds the solution to the constrained problem:

****** Py-BOBYQA Results ******
Solution xmin = [0.9  0.81]
Objective value f(xmin) = 0.01
Needed 146 objective evaluations (at 146 points)
Approximate gradient = [-2.00000006e-01 -4.74578563e-09]
Approximate Hessian = [[ 649.66398033 -361.03094781]
 [-361.03094781  199.94223213]]
Exit flag = 0
Success: rho has reached rhoend
******************************

However, we also get a warning that our starting point was outside of the bounds:

RuntimeWarning: x0 above upper bound, adjusting

Py-BOBYQA automatically fixes this, and moves \(x_0\) to a point within the bounds, in this case \(x_0=(-1.2,0.85)\).
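
Alternatively, we could avoid the warning by moving the starting point inside the bounds ourselves before calling the solver:

x0 = np.clip(x0, lower, upper)   # clip each entry of x0 to lie within [lower, upper]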

We can also get Py-BOBYQA to print out more detailed information about its progress using the logging module. To do this, we need to add the following lines:

import logging
logging.basicConfig(level=logging.INFO, format='%(message)s')

# ... (call pybobyqa.solve)

And we can now see each evaluation of objfun:

Function eval 1 at point 1 has f = 39.65 at x = [-1.2   0.85]
Initialising (coordinate directions)
Function eval 2 at point 2 has f = 14.337296 at x = [-1.08  0.85]
Function eval 3 at point 3 has f = 55.25 at x = [-1.2   0.73]
...
Function eval 145 at point 145 has f = 0.0100000013172792 at x = [0.89999999 0.80999999]
Function eval 146 at point 146 has f = 0.00999999999999993 at x = [0.9  0.81]
Did a total of 1 run(s)

If we wanted to save this output to a file, we could replace the above call to logging.basicConfig() with

logging.basicConfig(filename="myfile.log", level=logging.INFO,
                    format='%(message)s', filemode='w')

If you have logging for some parts of your code and you want to deactivate all Py-BOBYQA logging, you can use the optional argument do_logging=False in pybobyqa.solve().
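
For example, the call from the bounds example above could be written as

soln = pybobyqa.solve(rosenbrock, x0, bounds=(lower, upper), do_logging=False)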

Alternatively, we can get Py-BOBYQA to print progress information to the terminal at each iteration, by setting the optional argument print_progress=True in pybobyqa.solve(). If we do this for the above example, we get

 Run  Iter     Obj       Grad     Delta      rho     Evals
  1     1    1.43e+01  1.74e+02  1.20e-01  1.20e-01    5
  1     2    5.57e+00  1.20e+02  3.66e-01  1.20e-01    6
  1     3    5.57e+00  1.20e+02  6.00e-02  1.20e-02    6
...
  1    132   1.00e-02  2.00e-01  1.50e-08  1.00e-08   144
  1    133   1.00e-02  2.00e-01  1.50e-08  1.00e-08   145

Adding General Convex Constraints

We can also add more general convex constraints \(x \in C := C_1 \cap \cdots \cap C_n\) to our problem, where each \(C_i\) is a convex set. To do this, we need to know the Euclidean projection operator for each \(C_i\):

\[\operatorname{proj}_{C_i}(x) := \operatorname{argmin}_{y\in C_i} \|y-x\|_2^2.\]

i.e. given a point \(x\), the projection returns the closest point to \(x\) in the set \(C_i\). There are many examples of simple convex sets \(C_i\) for which this function has a known, simple form, such as:

  • Bound constraints (but since Py-BOBYQA supports these directly, it is better to give them explicitly via the bounds input, as above)

  • Euclidean ball constraints: \(\|x-c\|_2 \leq r\)

  • Unit simplex: \(x_i \geq 0\) and \(\sum_{i=1}^{n} x_i \leq 1\)

  • Linear inequalities: \(a^T x \geq b\)

In Py-BOBYQA, set the input projections to be a list of projection functions, one per \(C_i\). Internally, Py-BOBYQA computes the projection onto the intersection of these sets and the bound constraints using Dykstra’s projection algorithm.

For the explicit expressions for the above projections, and more examples, see for example this online database or Section 6.4.6 of the textbook [B2017].
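
For instance, the projection onto a single half-space \(\{x : a^Tx \geq b\}\) is \(\operatorname{proj}(x) = x + \max(0, (b - a^Tx)/\|a\|_2^2)\, a\), which could be coded as follows (a small sketch; the names a_vec, b_val and halfspace_proj, and the particular values, are illustrative):

# Projection onto the half-space {x : a^T x >= b}
a_vec = np.array([1.0, 1.0])
b_val = 1.0
halfspace_proj = lambda x: x + max(0.0, (b_val - np.dot(a_vec, x)) / np.dot(a_vec, a_vec)) * a_vec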

As an example, let’s minimize the above Rosenbrock function with different bounds, and with a Euclidean ball constraint, namely \((x_1-0.5)^2 + (x_2-1)^2 \leq 0.25^2\).

To do this, we can run

import numpy as np
import pybobyqa

# Define the objective function
def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# Define the starting point
x0 = np.array([-1.2, 1.0])

# Define bound constraints (lower <= x <= upper)
lower = np.array([0.7, -2.0])
upper = np.array([1.0, 2.0])

# Define the ball constraint ||x-center|| <= radius, and its projection operator
center = np.array([0.5, 1.0])
radius = 0.25
ball_proj = lambda x: center + (radius/max(np.linalg.norm(x-center), radius)) * (x-center)

# Call Py-BOBYQA (with bounds and projection operator)
# Note: it is better to provide bounds explicitly, instead of using the corresponding
#       projection function
# Note: input 'projections' must be a list of projection functions
soln = pybobyqa.solve(rosenbrock, x0, bounds=(lower,upper), projections=[ball_proj])

print(soln)

Py-BOBYQA correctly finds the solution to the constrained problem:

****** Py-BOBYQA Results ******
Solution xmin = [0.70424386 0.85583188]
Objective value f(xmin) = 13.03829114
Needed 25 objective evaluations (at 25 points)
Approximate gradient = [-101.9667031    71.97449424]
Approximate Hessian = [[ 253.11858775 -279.39193327]
 [-279.39193327  201.49725358]]
Exit flag = 0
Success: rho has reached rhoend
******************************

Just like for bound constraints, Py-BOBYQA will automatically ensure the starting point is feasible with respect to all constraints (bounds and general convex constraints).
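
If the feasible region is the intersection of several convex sets, we simply pass one projection function per set. For example, to combine the ball constraint above with the half-space projection sketched earlier (illustrative only), we could write

soln = pybobyqa.solve(rosenbrock, x0, bounds=(lower, upper),
                      projections=[ball_proj, halfspace_proj])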

Example: Noisy Objective Evaluation

As described in Overview, derivative-free algorithms such as Py-BOBYQA are particularly useful when objfun has noise. Let’s modify the previous example to include random noise in our objective evaluation, and compare it to a derivative-based solver:

# Py-BOBYQA example: minimize the noisy Rosenbrock function
from __future__ import print_function
import numpy as np
import pybobyqa

# Define the objective function
def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# Modified objective function: add 1% Gaussian noise
def rosenbrock_noisy(x):
    return rosenbrock(x) * (1.0 + 1e-2 * np.random.normal(size=(1,))[0])

# Define the starting point
x0 = np.array([-1.2, 1.0])

# Set random seed (for reproducibility)
np.random.seed(0)

print("Demonstrate noise in function evaluation:")
for i in range(5):
    print("objfun(x0) = %g" % rosenbrock_noisy(x0))
print("")

# Call Py-BOBYQA
soln = pybobyqa.solve(rosenbrock_noisy, x0)

# Display output
print(soln)

# Compare with a derivative-based solver
import scipy.optimize as opt
soln = opt.minimize(rosenbrock_noisy, x0)

print("")
print("** SciPy results **")
print("Solution xmin = %s" % str(soln.x))
print("Objective value f(xmin) = %.10g" % (soln.fun))
print("Needed %g objective evaluations" % soln.nfev)
print("Exit flag = %g" % soln.status)
print(soln.message)

The output of this is:

Demonstrate noise in function evaluation:
objfun(x0) = 24.6269
objfun(x0) = 24.2968
objfun(x0) = 24.4369
objfun(x0) = 24.7423
objfun(x0) = 24.6519

****** Py-BOBYQA Results ******
Solution xmin = [-1.04327395  1.09935385]
Objective value f(xmin) = 4.080030471
Needed 42 objective evaluations (at 42 points)
Approximate gradient = [-3786376.5065785   5876675.51763198]
Approximate Hessian = [[ 1.32881117e+14 -2.68241358e+14]
 [-2.68241358e+14  6.09785319e+14]]
Exit flag = 0
Success: rho has reached rhoend
******************************


** SciPy results **
Solution xmin = [-1.20013817  0.99992915]
Objective value f(xmin) = 23.86371763
Needed 80 objective evaluations
Exit flag = 2
Desired error not necessarily achieved due to precision loss.

Although Py-BOBYQA does not find the true solution (and it cannot produce a good estimate of the objective gradient and Hessian), it still gives a reasonable decrease in the objective. However, SciPy's derivative-based solver, which has no trouble solving the noise-free problem, is unable to make any progress.

As noted above, Py-BOBYQA has an input parameter objfun_has_noise to indicate if objfun has noise in it, which it does in this case. Therefore we can call Py-BOBYQA with

soln = pybobyqa.solve(rosenbrock_noisy, x0, objfun_has_noise=True)

This time, we find the true solution, and better estimates of the gradient and Hessian:

****** Py-BOBYQA Results ******
Solution xmin = [1. 1.]
Objective value f(xmin) = 1.237351799e-19
Needed 300 objective evaluations (at 300 points)
Did a total of 5 runs
Approximate gradient = [-2.17072738e-07  9.62304351e-08]
Approximate Hessian = [[ 809.56521044 -400.33737779]
 [-400.33737779  198.36487985]]
Exit flag = 1
Warning (max evals): Objective has been called MAXFUN times
******************************
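
The warning flag here indicates that Py-BOBYQA used its full evaluation budget (the default is \(\min(100(n+1),1000)=300\) evaluations for this problem). If we wished to continue further, we could increase the budget via the maxfun argument, for example

soln = pybobyqa.solve(rosenbrock_noisy, x0, objfun_has_noise=True, maxfun=500)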

Example: Global Optimization

The following example shows how to use the global optimization features of Py-BOBYQA. Here, we try to minimize the Freudenstein and Roth function (problem 2 in J.J. Moré, B.S. Garbow and K.E. Hillstrom, Testing Unconstrained Optimization Software, ACM Trans. Math. Software 7:1 (1981), 17-41). This function has two local minima, one of which is global.

Note that Py-BOBYQA only implements a heuristic method, so there are no guarantees it will find a global minimum. However, by using the seek_global_minimum flag, it is more likely to escape local minima if there are better values nearby.

# Py-BOBYQA example: globally minimize the Freudenstein and Roth function
from __future__ import print_function
import numpy as np
import pybobyqa

# Define the objective function
# This function has a local minimum f = 48.98
# at x = np.array([11.41, -0.8968])
# and a global minimum f = 0 at x = np.array([5.0, 4.0])
def freudenstein_roth(x):
    r1 = -13.0 + x[0] + ((5.0 - x[1]) * x[1] - 2.0) * x[1]
    r2 = -29.0 + x[0] + ((1.0 + x[1]) * x[1] - 14.0) * x[1]
    return r1 ** 2 + r2 ** 2

# Define the starting point
x0 = np.array([5.0, -20.0])

# Define bounds (required for global optimization)
lower = np.array([-30.0, -30.0])
upper = np.array([30.0, 30.0])

# Set random seed (for reproducibility)
np.random.seed(0)

print("First run - search for local minimum only")
print("")
soln = pybobyqa.solve(freudenstein_roth, x0, maxfun=500,
                      bounds=(lower, upper))
print(soln)

print("")
print("")

print("Second run - search for global minimum")
print("")
soln = pybobyqa.solve(freudenstein_roth, x0, maxfun=500,
                      bounds=(lower, upper),
                      seek_global_minimum=True)
print(soln)

The output of this is:

First run - search for local minimum only

****** Py-BOBYQA Results ******
Solution xmin = [11.41277902 -0.89680525]
Objective value f(xmin) = 48.98425368
Needed 143 objective evaluations (at 143 points)
Approximate gradient = [-1.64941396e-07  9.69795781e-07]
Approximate Hessian = [[   7.74717421 -104.51102613]
 [-104.51102613 1135.76500421]]
Exit flag = 0
Success: rho has reached rhoend
******************************



Second run - search for global minimum

****** Py-BOBYQA Results ******
Solution xmin = [5. 4.]
Objective value f(xmin) = 3.659891409e-17
Needed 500 objective evaluations (at 500 points)
Did a total of 5 runs
Approximate gradient = [ 8.70038835e-10 -4.64918043e-07]
Approximate Hessian = [[   4.28883646   64.16836253]
 [  64.16836253 3722.93109385]]
Exit flag = 1
Warning (max evals): Objective has been called MAXFUN times
******************************

As we can see, the seek_global_minimum flag helped Py-BOBYQA escape the local minimum from the first run, and find the global minimum. More details are given in [CRO2022].

References

[CFMR2018]

Coralia Cartis, Jan Fiala, Benjamin Marteau and Lindon Roberts, Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers, ACM Transactions on Mathematical Software, 45:3 (2019), pp. 32:1-32:41.

[CRO2022]

Coralia Cartis, Lindon Roberts and Oliver Sheridan-Methven, Escaping local minima with derivative-free methods: a numerical investigation, Optimization, 71:8 (2022), pp. 2343-2373. [arXiv preprint: 1812.11343]

[R2024]

Lindon Roberts, Model Construction for Convex-Constrained Derivative-Free Optimization, arXiv preprint arXiv:2403.14960 (2024).

[B2017]

Amir Beck, First-Order Methods in Optimization, SIAM (2017).