A scalable, fast C++ machine learning library with an emphasis on usability.
GMM initialization is now safer and provides a working GMM when constructed with only the dimensionality and number of Gaussians (#314); a construction sketch follows these notes.
Added a check for division by zero in the forward-backward algorithm for HMMs (#314).
Fixed MaxVarianceNewCluster (used when re-initializing clusters for k-means) (#314).
Fixed implementation of Viterbi algorithm in HMM::Predict() (#316).
Significant speedups for dual-tree algorithms using the cover tree (#243, #329) including a faster implementation of FastMKS.
Fix for LRSDP optimizer so that it compiles and can be used (#325).
CF (collaborative filtering) now expects users and items to be zero-indexed, not one-indexed (#324).
CF::GetRecommendations() API change: the number of recommendations is now the first parameter, and the number of users in the local neighborhood should be specified with CF::NumUsersForSimilarity() (a usage sketch follows these notes).
Removed incorrect PeriodicHRectBound (#30).
Refactored LRSDP into an LRSDP class and a standalone function to be optimized (#318).
Fix for centering in kernel PCA (#355).
Added simulated annealing (SA) optimizer, contributed by Zhihao Lou.
HMMs now support initial state probabilities; these can be set in the constructor, trained, or set manually with HMM::Initial() (#315); see the sketch following these notes.
Added the Nyström method for kernel matrix approximation, contributed by Marcus Edel.
Kernel PCA now supports using the Nyström method for approximation (the approximation formula is given after these notes).
Ball trees now work with dual-tree algorithms, via the BallBound<> bound structure (#320); contributed by Yash Vadalia.
The NMF class is now AMF<> and supports far more types of factorizations; contributed by Sumedh Ghaisas.
A QUIC-SVD implementation has returned, written by Siddharth Agrawal and based on older code from Mudit Gupta.
Added perceptron and decision stump by Udit Saxena (these are weak learners for an eventual AdaBoost class).
Sparse autoencoder added by Siddharth Agrawal.
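The safer GMM initialization noted above means a model built from just a number of Gaussians and a dimensionality is immediately usable. A minimal sketch, assuming the 1.0.x-era GMM<> class with a (gaussians, dimensionality) constructor and an Estimate() training method; the header path, data, and parameter values are illustrative and may differ from the actual API:

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/gmm/gmm.hpp>

using namespace mlpack::gmm;

int main()
{
  // Three Gaussians in five dimensions; after the fix this constructor
  // yields a usable (if uninformative) mixture even before training.
  GMM<> gmm(3, 5);

  // The model can later be fit to a 5 x N data matrix with EM.
  arma::mat data = arma::randu<arma::mat>(5, 1000);
  gmm.Estimate(data);

  return 0;
}
```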
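The CF::GetRecommendations() change can be illustrated with a short sketch. Assumptions here (from memory, not guaranteed): construction from a 3 x N coordinate list of (user, item, rating) columns, an arma::Mat<size_t> output for recommendations, and a reference-style NumUsersForSimilarity() accessor.

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/cf/cf.hpp>

using namespace mlpack::cf;

int main()
{
  // Coordinate list of (user, item, rating) triples; users and items are
  // now zero-indexed, not one-indexed.
  arma::mat data("0 0 4.0; 0 1 3.5; 1 0 5.0; 1 2 2.0; 2 1 1.5; 2 2 4.5");
  data = data.t();  // One column per (user, item, rating) triple.

  CF cf(data);

  // Neighborhood size is now set separately (reference-accessor form assumed).
  cf.NumUsersForSimilarity() = 2;

  // The number of recommendations per user is now the first parameter.
  arma::Mat<size_t> recommendations;
  cf.GetRecommendations(1, recommendations);

  return 0;
}
```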
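The new HMM initial state probabilities can be set by hand through HMM::Initial(), per the note above. A minimal sketch, assuming the 1.0.x HMM<DiscreteDistribution> constructor taking a state count and an emission distribution; the probability values are made up:

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/hmm/hmm.hpp>

using namespace mlpack::hmm;
using namespace mlpack::distribution;

int main()
{
  // Two hidden states, each emitting one of four discrete symbols.
  HMM<DiscreteDistribution> hmm(2, DiscreteDistribution(4));

  // Set the initial state probabilities manually; they can also be
  // learned during training.
  arma::vec initial(2);
  initial[0] = 0.7;
  initial[1] = 0.3;
  hmm.Initial() = initial;

  return 0;
}
```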
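For the Nyström items above, the standard Nyström construction (stated here as background, not as a description of mlpack's exact implementation) approximates an n x n kernel matrix K from m sampled landmark points:

```latex
% C : n x m kernel evaluations between all n points and the m landmarks
% W : m x m kernel matrix among the landmarks (W^{+} is its pseudoinverse)
K \;\approx\; C \, W^{+} \, C^{\top}
```

Kernel PCA can then work with this low-rank factorization instead of the full kernel matrix, which is what makes the approximation attractive for large datasets.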
- Programming Language: C++
- Operating System:
- Data Formats:
- JMLR-MLOSS Publication: "MLPACK: A Scalable C++ Machine Learning Library", Journal of Machine Learning Research 14, 2013.