About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design, and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). Interfaces for C, Matlab/Octave, and Python are also provided.

Changes:
-Complete refactoring of the inner parts of the library. The code is easier to understand and modify, and it allows simpler integration of new algorithms.
-Updated to the latest version of NLOPT (2.4.1). The wrapper code has been simplified.
-Error codes have been replaced with exceptions in the C++ interface. The library is exception safe.
-The API has been modified to support new learning methods for kernel hyperparameters (e.g., MCMC). Warning: the configuration parameters for learning have changed, so code written for previous versions might not work. Some of the learning methods (such as MCMC) are not yet implemented.
-Added configuration of random number generation (the seed can be fixed for debugging). Fixed an issue where random numbers were drawn from different sources, with potentially correlated streams. All components are now guaranteed to use the same instance of the random engine.
-Improved numerical results (e.g., hyperparameter optimization is now done in log space).
-More examples and tests.
-The default number of inner iterations has been increased, so overall optimization time with the default configuration might be longer, but results are improved.