Project details for minFunc

minFunc 2012

by markSchmidt - December 18, 2013, 01:07:07 CET

Description:

minFunc is a Matlab function for unconstrained optimization of differentiable real-valued multivariate functions using line-search methods. It uses an interface very similar to the Matlab Optimization Toolbox function fminunc, and can be called as a replacement for this function. On many problems, minFunc requires fewer function evaluations to converge than fminunc (or minimize.m). Further, it can optimize problems with a much larger number of variables (fminunc is restricted to several thousand variables), and it uses a line search that is robust to several common function pathologies.

The default parameters of minFunc call a quasi-Newton strategy, where limited-memory BFGS updates with Shanno-Phua scaling are used in computing the step direction, and a bracketing line search for a point satisfying the strong Wolfe conditions is used to compute the step length. In the line search, (safeguarded) cubic interpolation is used to generate trial values, and the method switches to an Armijo back-tracking line search on iterations where the objective function enters a region where the parameters do not produce a real-valued output (i.e. complex, NaN, or Inf).
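For reference, the strong Wolfe conditions that the bracketing line search targets can be sketched as follows. This is an illustrative sketch, not minFunc's own implementation; the function name strongWolfe and the parameter names are hypothetical, and c1 = 1e-4, c2 = 0.9 are merely common textbook choices:

```matlab
% Strong Wolfe conditions for a step length t along direction d:
%   sufficient decrease: f(x + t*d) <= f(x) + c1*t*g'*d
%   curvature:           |g(x + t*d)'*d| <= c2*|g(x)'*d|
% with 0 < c1 < c2 < 1.
function ok = strongWolfe(funObj, x, d, t, f0, g0, c1, c2)
  [ft, gt] = funObj(x + t*d);             % value and gradient at trial point
  decrease  = ft <= f0 + c1*t*(g0'*d);    % Armijo (sufficient decrease)
  curvature = abs(gt'*d) <= c2*abs(g0'*d);% strong curvature condition
  ok = decrease && curvature;
end
```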

Some highlights of the non-default features present in minFunc:

Step directions can be computed based on: Exact Newton (requires user-supplied Hessian), full quasi-Newton approximation (uses a dense Hessian approximation), limited-memory BFGS (uses a low-rank Hessian approximation - default), (preconditioned) Hessian-free Newton (uses Hessian-vector products), (preconditioned) conjugate gradient (uses only previous step and a vector beta), Barzilai and Borwein (uses only previous step), or (cyclic) steepest descent.
Step lengths can be computed based on either the (non-monotone) Armijo or Wolfe conditions, and trial values can be generated by either backtracking/bisection, or polynomial interpolation. Several strategies are available for selecting the initial trial value.
Numerical differentiation and derivative checking are available, including an option for automatic differentiation using complex-step differentials (if the objective function code handles complex inputs).
Most methods have user-modifiable parameters, such as the number of corrections to store for L-BFGS, modification options for Hessian matrices that are not positive-definite in the pure Newton method, choice of preconditioning and Hessian-vector product functions for the Hessian-free Newton method, choice of update method, scaling, and preconditioning for the non-linear conjugate gradient method, the type of Hessian approximation to use in the quasi-Newton iteration, the number of steps to look back for the non-monotone Armijo condition, the parameters of the line search algorithm, the parameters of the termination criteria, etc.
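As an illustration of the complex-step idea mentioned above (a generic sketch, not code from minFunc): for an analytic objective whose code handles complex inputs, the imaginary part of f(x + ih)/h approximates f'(x) without the subtractive cancellation that limits ordinary finite differences, so h can be taken extremely small:

```matlab
f = @(x) exp(x)./sqrt(x);      % example objective that handles complex inputs
x = 1.5;
h = 1e-20;                     % step can be tiny: no subtraction occurs
dfdx = imag(f(x + 1i*h))/h;    % approximates f'(x) to near machine precision
```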

Usage:

minFunc uses an interface very similar to Matlab's fminunc. If you currently call 'fminunc(@myFunc,x0,options,myFuncArg1,myFuncArg2)', then you can use minFunc instead by simply replacing 'fminunc' with 'minFunc'. Note that by default minFunc assumes that the gradient is supplied, unless the 'numDiff' option is set to 1 (for forward-differencing) or 2 (for central-differencing). minFunc supports many (but not all) of the same parameters as fminunc, with some differences in naming, and it also has many parameters that are not available in fminunc. 'help minFunc' will give a list of the parameters and their explanation.
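As a concrete sketch of that calling convention (the Rosenbrock objective below is our own example, not part of the package), the objective returns both the function value and the gradient, which minFunc expects by default:

```matlab
% Rosenbrock function returning value and gradient.
function [f, g] = rosenbrock(x)
  f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
  g = [ -400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
         200*(x(2) - x(1)^2) ];
end

% Minimize from a starting point with the default (L-BFGS) settings:
x0 = [-1.2; 1];
x = minFunc(@rosenbrock, x0);

% If only the function value is available, request numerical differencing:
% options.numDiff = 1;                 % 1 = forward, 2 = central differencing
% x = minFunc(@rosenbrock, x0, options);
```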

Changes to previous version:

Initial Announcement on mloss.org.

Supported Operating Systems: Platform Independent
Data Formats: Matlab
Tags: Optimization
