Project details for BayesOpt, a Bayesian Optimization toolbox

BayesOpt, a Bayesian Optimization toolbox 0.8.2

by rmcantin - December 9, 2015, 04:53:31 CET


Description:

BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO), Sequential Model-Based Optimization (SMBO) or Efficient Global Optimization (EGO).

There are also interfaces for C, Matlab/Octave and Python. The online HTML version of the documentation is available at: http://rmcantin.bitbucket.org/html/
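As a rough illustration, the following is a minimal sketch of how the C++ interface is typically used according to the online documentation: the objective is defined by subclassing the continuous optimization model and implementing the evaluation callback. The class, header and member names (bayesopt::ContinuousModel, bayesopt::Parameters, evaluateSample, optimize, vectord) follow the documentation but should be checked against the installed version; the quadratic objective is purely hypothetical.

    #include <bayesopt/bayesopt.hpp>   // bayesopt::ContinuousModel, bayesopt::Parameters, vectord

    // Hypothetical objective: a smooth 2D quadratic with its minimum at (0.3, 0.7).
    // In a real application, evaluateSample() would call the expensive function to optimize.
    class QuadraticExample : public bayesopt::ContinuousModel
    {
    public:
      QuadraticExample(bayesopt::Parameters params)
        : ContinuousModel(2, params) {}        // 2 = input dimension

      // Called by the library for every query point it decides to evaluate.
      double evaluateSample(const vectord &query)
      {
        double dx = query(0) - 0.3, dy = query(1) - 0.7;
        return dx * dx + dy * dy;
      }
    };

    int main()
    {
      bayesopt::Parameters params;     // default configuration
      params.n_iterations = 50;        // number of Bayesian optimization iterations

      QuadraticExample opt(params);
      // If the default unit box per dimension is not appropriate, bounds can be
      // set with setBoundingBox() before calling optimize().
      vectord best(2);                 // best point found (a boost::ublas vector)
      opt.optimize(best);
      return 0;
    }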

Bayesian optimization uses a distribution over functions to build a metamodel of the unknown function whose extrema we are looking for, and then applies an active learning strategy to select the query points of greatest potential interest for the search. For that reason, it has traditionally been intended for the optimization of expensive functions. However, the efficiency of the library also makes it interesting for many other types of functions. It is intended to be both fast and clear for development and research. At the same time, it does everything the "right way". For example:

  • there are different methods that can be used for the preliminary design step,
  • extensive use of Cholesky decomposition and related techniques to improve numerical stability and reduce computational cost,
  • kernels, criteria and parametric functions can be combined to produce more advanced functions (see the configuration sketch after this list), etc.
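As a sketch of the last point, combined kernels and criteria are selected through the parameters structure using name strings. The names below (kSum, kSEARD, kMaternARD5, cHedge, cEI, cLCB, cPOI) and the exact field layout follow the naming convention listed in the documentation, but they are illustrative assumptions and should be verified there.

    bayesopt::Parameters params;
    // Combined kernel: sum of a squared-exponential (ARD) and a Matern 5/2 (ARD) kernel.
    params.kernel.name = "kSum(kSEARD,kMaternARD5)";
    // Portfolio criterion (GP-Hedge) combining several basic acquisition criteria.
    params.crit_name = "cHedge(cEI,cLCB,cPOI)";
    // Size of the preliminary (initial) design, sampled before the optimization loop starts.
    params.n_init_samples = 20;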

Originally, it was developed as part of a robotics research project, using a Gaussian process with hyperpriors on the mean and signal covariance parameters. The metamodel was then constructed using the Maximum a Posteriori (MAP) estimate of the parameters. However, the library has now grown to support many more surrogate models with different distributions (Gaussian processes, Student's-t processes, etc.), and with many kernels and mean functions. It also provides different criteria (including some combined criteria), so the library can be used for any problem involving bounded optimization, stochastic bandits, active learning for regression, etc.
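To illustrate, the surrogate model, its hyperparameter learning method and the parametric mean function are also chosen through the same parameters structure. The names below (sStudentTProcessJef, mConst, L_MCMC) are assumptions based on the naming scheme in the documentation and may not match the installed version exactly.

    bayesopt::Parameters params;
    // Surrogate model: a Student's-t process with a Jeffreys prior on the parameters.
    params.surr_name = "sStudentTProcessJef";
    // Parametric mean function of the surrogate (here, a constant mean).
    params.mean.name = "mConst";
    // Hyperparameter learning: MCMC sampling instead of a MAP point estimate.
    params.l_type = L_MCMC;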

You can also find more details on the project webpage: http://rmcantin.bitbucket.org/html/

Changes to previous version:

-Fixed bug in save/restore.

-Fixed bug in initial design.

BibTeX Entry: Download
Corresponding Paper BibTeX Entry: Download
Supported Operating Systems: Linux, Windows, macOS
Data Formats: Agnostic
Tags: Active Learning, Optimization, Experimental Design, Gaussian Process, Bandits, Bayesian Optimization
Archive: download here

Other available revisions

Version Changelog Date
0.8.2

-Fixed bug in save/restore.

-Fixed bug in initial design.

December 9, 2015, 04:53:31
0.8

-New save/restore functionality.

-Added method to optimize standalone programs/processes via CLI or XML.

-Improved C++ API with new Parameters class.

-Cleaned API and tree structure.

-New demos/examples.

-Fixed bugs.

-Extended documentation.

-New license (Affero GPL v3)

December 5, 2015, 01:29:11
0.7.2

-Fixed bugs and doc typos

October 10, 2014, 19:12:59
0.7.1

-Added MI criterion

-Simplified Python install

-Fixed bugs in annealed criteria

July 3, 2014, 00:30:50
0.7

-Refactoring to organize the code.

-MCMC learning for hyperparameters.

-Simpler and uniform interface.

-New configuration parameters (random jump, random seed, ...).

-New and improved demos.

-Fixed error codes/exceptions in Matlab and Python.

-Extended documentation.

-Code optimization.

-Bug fixes.

May 20, 2014, 23:38:55
0.6

-Complete refactoring of inner parts of the library. The code is easier to understand/modify and it allows simpler integration with new algorithms.

-Updated to the latest version of NLOPT (2.4.1). Wrapper code simplified.

-Error codes replaced with exceptions in C++ interface. Library is exception safe.

-API modified to support new learning methods for kernel hyperparameters (e.g., MCMC). Warning: configuration parameters related to learning have changed, so code using previous versions might not work. Some of the learning methods (like MCMC) are not yet implemented.

-Added configuration of random numbers (can be fixed for debugging). Fixed an issue with random numbers using different sources, or random numbers with potential correlations. Now all the elements are guaranteed to use the same instance of the random engine.

-Improved numerical results (e.g., hyperparameter optimization is done in log space).

-More examples and tests.

-Fixed bugs.

-The number of inner iterations has been increased by default, so the overall optimization time using the default configuration might be longer, but with improved results.

March 26, 2014, 17:48:17
0.5.1

-Fixed bugs.

-Extended error handling and notifications.

-Improved numerical stability of Hedging algorithms.

August 23, 2013, 18:49:36
0.5

-Fixed bugs.

-Improved and extended documentation.

-Simplified installation. Separate Python module.

-New demos and examples.

-REMBO algorithm (Bayesian optimization in high dimensions through random embedding).

-Support for Sobol sequences for initial design.

-Support for Matlab on Windows using MinGW (support for Visual Studio was already available).

July 26, 2013, 00:51:50
0.4.1

-Fixed bugs.

-Improved and extended documentation.

-Extended and simplified API across platforms.

-Extended functionality (new surrogate functions, new priors, new kernels, new criteria).

-Improved modularity of the optimization process to allow plotting and debugging of intermediate steps.

-Added more demos and examples.

May 15, 2013, 19:36:40
0.3

Initial Announcement on mloss.org.

April 1, 2013, 03:46:21
