Project details for BayesOpt, a Bayesian Optimization toolbox

Logo BayesOpt, a Bayesian Optimization toolbox 0.3

by rmcantin - April 1, 2013, 03:46:21 CET


Description:

This is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO).

There are also interfaces for C, Matlab/Octave and Python. The online HTML version of the documentation is available at http://rmcantin.bitbucket.org/html/

Basically, it uses a distribution over functions to build a metamodel of the unknown function whose extremum we are looking for, and then applies an active learning strategy to select the query points of greatest potential interest for the search. For that reason, it has traditionally been intended for the optimization of expensive functions. However, the efficiency of the library also makes it interesting for many other types of functions.
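The loop described above — fit a surrogate to the samples seen so far, then query wherever an acquisition criterion looks most promising — can be sketched from scratch. This is only an illustrative sketch using a hand-rolled Gaussian process and a lower-confidence-bound criterion; it does not use BayesOpt's actual API, and every function name here is hypothetical.

```python
import numpy as np

def fit_gp(X, y, ls=0.3, jitter=1e-6):
    # Zero-mean GP with a squared-exponential kernel (illustrative, 1-D inputs).
    K = np.exp(-0.5 * ((X[:, None] - X[None, :]) / ls) ** 2) + jitter * np.eye(len(X))
    return X, np.linalg.solve(K, y), K, ls

def predict_gp(model, Xs):
    X, alpha, K, ls = model
    Ks = np.exp(-0.5 * ((Xs[:, None] - X[None, :]) / ls) ** 2)
    mu = Ks @ alpha
    # Posterior variance: prior variance (1) minus the explained part.
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.maximum(var, 1e-12)

def bayes_opt(f, n_init=5, n_iter=15):
    X = np.linspace(0.05, 0.95, n_init)          # preliminary design
    y = np.array([f(x) for x in X])
    grid = np.linspace(0.0, 1.0, 201)            # candidate query points
    for _ in range(n_iter):
        mu, var = predict_gp(fit_gp(X, y), grid)
        # Lower confidence bound: favour low predicted mean or high uncertainty.
        x_next = grid[np.argmin(mu - 2.0 * np.sqrt(var))]
        X, y = np.append(X, x_next), np.append(y, f(x_next))
    return X[np.argmin(y)], y.min()

x_best, y_best = bayes_opt(lambda x: (x - 0.7) ** 2)
```

A real implementation such as BayesOpt additionally learns the kernel hyperparameters and optimizes the criterion with a proper inner optimizer rather than a fixed grid.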

It is intended to be both fast and clear for development and research. At the same time, it does everything the "right way". For example:

  • Latin hypercube sampling is used for the preliminary sampling step

  • kernel parameters are trained with the preliminary samples and fixed afterwards to avoid bias and divergence

  • matrix algebra tricks are used to guarantee that any covariance matrix remains SPD and to reduce computational cost.

  • etc.
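As an illustration of the first point above, a Latin hypercube design of n points places exactly one sample in each of the n equal-width strata along every axis. A minimal NumPy sketch, not taken from the library (the function name is made up):

```python
import numpy as np

def latin_hypercube(n, dim, seed=0):
    """n samples in [0, 1)^dim with exactly one sample per axis stratum."""
    rng = np.random.default_rng(seed)
    # Row i lands in stratum [i/n, (i+1)/n) on every axis...
    u = (np.arange(n)[:, None] + rng.random((n, dim))) / n
    # ...then each column is permuted independently to decouple the axes.
    for d in range(dim):
        u[:, d] = rng.permutation(u[:, d])
    return u

pts = latin_hypercube(10, 3)
```

Compared with plain uniform sampling, this guarantees even one-dimensional coverage even for small preliminary designs.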

Originally, it was developed as part of a robotics research project, using a Gaussian process with hyperpriors on the mean and signal covariance parameters. The metamodel was then constructed using the maximum a posteriori (MAP) estimate of the parameters. However, the library has since grown to support many more surrogate models with different distributions (Gaussian processes, Student's t processes, etc.), along with many kernels and mean functions. It also provides different criteria (including some combined criteria), so the library can be applied to any problem involving bounded optimization, stochastic bandits, active learning for regression, etc.
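As an example of such a criterion, the classic expected improvement (here for minimization) has a closed form under a Gaussian predictive distribution. A standalone sketch, independent of the library's own implementation:

```python
import math

def expected_improvement(mu, sigma, y_best):
    """EI for minimization: E[max(y_best - Y, 0)] with Y ~ N(mu, sigma^2)."""
    if sigma <= 0.0:
        # Degenerate prediction: improvement is deterministic.
        return max(y_best - mu, 0.0)
    z = (y_best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return sigma * (z * cdf + pdf)
```

EI is always non-negative and grows with the predictive uncertainty, which is what makes it a natural trade-off between exploitation and exploration.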

You can also find more details at: http://bitbucket.org/rmcantin/bayesian-optimization/wiki/Home

Changes to previous version:

Initial Announcement on mloss.org.

BibTeX Entry: Download
URL: Project Homepage
Supported Operating Systems: Linux, Windows, macOS
Data Formats: Agnostic
Tags: Active Learning, Optimization, Experimental Design, Gaussian Process, Bandits, Bayesian Optimization
Archive: download here

Other available revisions

Version Changelog Date
0.7.2

-Fixed bugs and doc typos

October 10, 2014, 19:12:59
0.7.1

-Added MI criterion

-Simplified Python install

-Fixed bugs in annealed criteria

July 3, 2014, 00:30:50
0.7

-Refactoring to organize the code.

-MCMC learning for hyperparameters.

-Simpler and uniform interface.

-New configuration parameters (random jump, random seed...)

-New and improved demos.

-Fixed error codes/exceptions in Matlab and Python.

-Extended documentation.

-Code optimization.

-Bug fixes.

May 20, 2014, 23:38:55
0.6

-Complete refactoring of the inner parts of the library. The code is easier to understand/modify and it allows simpler integration with new algorithms.

-Updated to the latest version of NLOPT (2.4.1). Wrapper code simplified.

-Error codes replaced with exceptions in C++ interface. Library is exception safe.

-API modified to support new learning methods for kernel hyperparameters (e.g., MCMC). Warning: configuration parameters related to learning have changed, so code using previous versions might not work. Some of the learning methods (like MCMC) are not yet implemented.

-Added configuration of random number generation (the seed can be fixed for debugging). Fixed an issue where random numbers were drawn from different sources, with potential correlations. Now all the elements are guaranteed to use the same instance of the random engine.

-Improved numerical results (e.g., hyperparameter optimization is done in log space).

-More examples and tests.

-Fixed bugs.

-The default number of inner iterations has been increased, so overall optimization time using the default configuration might be longer, but with improved results.

March 26, 2014, 17:48:17
0.5.1

-Fixed bugs.

-Extended error handling and notifications.

-Improved numerical stability of Hedging algorithms.

August 23, 2013, 18:49:36
0.5

-Fixed bugs.

-Improved and extended documentation.

-Simplified installation. Separate Python module.

-New demos and examples.

-REMBO algorithm (Bayesian optimization in high dimensions through random embedding).

-Support for Sobol sequences for initial design.

-Support for Matlab in Windows using MinGW (support for Visual Studio was already available).

July 26, 2013, 00:51:50
0.4.1

-Fixed bugs.

-Improved and extended documentation.

-Extended and simplified API across platforms.

-Extended functionality (new surrogate functions, new priors, new kernels, new criteria).

-Improved modularity of the optimization process to allow plotting and debugging of intermediate steps.

-Added more demos and examples.

May 15, 2013, 19:36:40
0.3

Initial Announcement on mloss.org.

April 1, 2013, 03:46:21
