DFO-LS: Derivative-Free Optimizer for Least-Squares Minimization
Release: 1.5.3
Date: 30 October 2024
Author: Lindon Roberts
DFO-LS is a flexible package for finding local solutions to nonlinear least-squares minimization problems (with optional regularizer and constraints), without requiring any derivatives of the objective. DFO-LS stands for Derivative-Free Optimizer for Least-Squares.
That is, DFO-LS solves
\[\min_{x\in\mathbb{R}^n} \quad f(x) := \sum_{i=1}^{m} r_i(x)^2 + h(x), \quad \text{subject to} \quad x \in C,\]
where the residuals \(r_i(x)\) may be nonlinear and need not be differentiable.
The optional regularizer \(h(x)\) is a Lipschitz continuous, convex (but possibly non-differentiable) function that is typically used to avoid overfitting. A common choice is \(h(x)=\lambda \|x\|_1\) (called L1 regularization or LASSO) for \(\lambda>0\). Note that in the case of Tikhonov regularization/ridge regression, \(h(x)=\lambda\|x\|_2^2\) is not Lipschitz continuous, so it should instead be incorporated by adding an extra term into the least-squares sum, \(r_{m+1}(x)=\sqrt{\lambda} \|x\|_2\). The (optional) constraint set \(C\) is the intersection of multiple convex sets provided as input by the user. All constraints are non-relaxable (i.e. DFO-LS will never ask to evaluate a point that is not feasible), although the general constraints \(x\in C\) may be slightly violated due to rounding errors.
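The Tikhonov reformulation above can be checked directly: squaring the extra residual \(r_{m+1}(x)=\sqrt{\lambda}\|x\|_2\) recovers the ridge penalty \(\lambda\|x\|_2^2\). The sketch below uses hypothetical placeholder residuals (not part of DFO-LS) to illustrate that the two ways of writing the objective agree.

```python
import math

def residuals(x):
    # Hypothetical residuals r_1(x), r_2(x) for a 2-variable problem
    # (placeholders for illustration only).
    return [x[0] - 1.0, x[1] - 2.0]

def objective_with_ridge(x, lam):
    # f(x) = sum_i r_i(x)^2 + lambda * ||x||_2^2, written directly
    # with the (non-Lipschitz) regularizer h(x) = lambda * ||x||_2^2.
    norm2_sq = sum(xi * xi for xi in x)
    return sum(r * r for r in residuals(x)) + lam * norm2_sq

def objective_with_extra_residual(x, lam):
    # Same f(x), but with the regularizer folded into the least-squares
    # sum as the extra residual r_{m+1}(x) = sqrt(lambda) * ||x||_2.
    extra = math.sqrt(lam) * math.sqrt(sum(xi * xi for xi in x))
    rs = residuals(x) + [extra]
    return sum(r * r for r in rs)

x, lam = [0.5, -1.5], 0.1
print(abs(objective_with_ridge(x, lam) - objective_with_extra_residual(x, lam)) < 1e-12)
```

This is why ridge regression does not need the general regularizer interface: the penalty is itself a smooth sum-of-squares term, so it can be treated as one more residual.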
Full details of the DFO-LS algorithm are given in our papers:
C. Cartis, J. Fiala, B. Marteau and L. Roberts, Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers, ACM Transactions on Mathematical Software, 45:3 (2019), pp. 32:1-32:41 [preprint].
M. Hough and L. Roberts, Model-Based Derivative-Free Methods for Convex-Constrained Optimization, SIAM Journal on Optimization, 32:4 (2022), pp. 2552-2579 [preprint].
Y. Liu, K. H. Lam and L. Roberts, Black-box Optimization Algorithms for Regularized Least-squares Problems, arXiv preprint arXiv:2407.14915, 2024.
DFO-LS is a more flexible version of DFO-GN.
If you are interested in solving general optimization problems (without a least-squares structure), you may wish to try Py-BOBYQA, which has many of the same features as DFO-LS.
DFO-LS is released under the GNU General Public License. Please contact NAG for alternative licensing.
- Installing DFO-LS
- Overview
- Using DFO-LS
- Nonlinear Least-Squares Minimization
- How to use DFO-LS
- Optional Arguments
- A Simple Example
- Adding Bounds and More Output
- Adding General Convex Constraints
- Adding a Regularizer
- Example: Noisy Objective Evaluation
- Example: Parameter Estimation/Data Fitting
- Example: Solving a Nonlinear System of Equations
- References
- Advanced Usage
- General Algorithm Parameters
- Logging and Output
- Initialization of Points
- Trust Region Management
- Termination on Small Objective Value
- Termination on Slow Progress
- Stochastic Noise Information
- Interpolation Management
- Regression Model Management
- Multiple Restarts
- Dynamically Growing Initial Set
- Dykstra’s Algorithm
- Checking Matrix Rank
- Handling regularizer
- References
- Diagnostic Information
- Version History
- Version 1.0 (6 Feb 2018)
- Version 1.0.1 (20 Feb 2018)
- Version 1.0.2 (20 Jun 2018)
- Version 1.1 (16 Jan 2019)
- Version 1.1.1 (5 Apr 2019)
- Version 1.2 (12 Feb 2020)
- Version 1.2.1 (13 Feb 2020)
- Version 1.2.2 (26 Feb 2021)
- Version 1.2.3 (1 Jun 2021)
- Version 1.3.0 (8 Nov 2021)
- Version 1.4.0 (29 Jan 2024)
- Version 1.4.1 (11 Apr 2024)
- Version 1.5.0 (11 Sep 2024)
- Version 1.5.1 (10 Oct 2024)
- Version 1.5.2 (28 Oct 2024)
- Version 1.5.3 (30 Oct 2024)
- Contributors
Acknowledgements
This software was initially developed under the supervision of Coralia Cartis, and was supported by the EPSRC Centre For Doctoral Training in Industrially Focused Mathematical Modelling (EP/L015803/1) in collaboration with the Numerical Algorithms Group. Development of DFO-LS has also been supported by the Australian Research Council (DE240100006).