About: DiffSharp is a functional automatic differentiation (AD) library providing gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products as higher-order functions. It allows exact and efficient calculation of derivatives, with support for nesting. Changes: Fixed: bug in the forward AD implementation of Sigmoid and ReLU for D, DV, and DM (fixes #16, thank you @mrakgr). Improvement: better performance from removing several more Parallel.For and Array.Parallel.map operations, which works better with OpenBLAS multithreading. Added: operations involving incompatible dimensions of DV and DM now throw exceptions to warn the user.
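For illustration, here is a minimal sketch of the idea behind exposing derivatives as higher-order functions, using forward-mode AD with dual numbers in plain Python; this is a conceptual stand-in, not DiffSharp's F# API, and the Dual class and grad helper are hypothetical names:

```python
# Conceptual sketch (not DiffSharp's F# API): forward-mode AD with dual
# numbers, exposing the gradient as a higher-order function.
class Dual:
    def __init__(self, primal, tangent=0.0):
        self.p, self.t = primal, tangent

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.p + other.p, self.t + other.t)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.p * other.p, self.t * other.p + self.p * other.t)

    __rmul__ = __mul__

def grad(f):
    """Return a function that computes the gradient of f : R^n -> R."""
    def gradf(x):
        g = []
        for i in range(len(x)):
            # seed the i-th tangent direction and propagate it forward
            duals = [Dual(v, 1.0 if j == i else 0.0) for j, v in enumerate(x)]
            g.append(f(duals).t)
        return g
    return gradf

g = grad(lambda v: v[0] * v[0] + 3.0 * v[1])   # f(x, y) = x^2 + 3y
print(g([2.0, 1.0]))                            # [4.0, 3.0]
```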
|
About: Lua-MapReduce framework implemented in Lua, using the luamongo driver and MongoDB as storage. It follows the iterative MapReduce approach for training machine learning statistical models. Changes:
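To illustrate the iterative MapReduce pattern described above, here is a minimal, in-process Python sketch of one k-means iteration expressed as map and reduce steps; it is a conceptual stand-in, not the Lua framework's API, and all function names are hypothetical:

```python
# Conceptual sketch: one k-means iteration as map and reduce steps, the kind
# of job an iterative MapReduce driver would repeat until convergence.
def map_point(point, centroids):
    # emit (index of closest centroid, (point, 1))
    idx = min(range(len(centroids)),
              key=lambda i: sum((p - c) ** 2 for p, c in zip(point, centroids[i])))
    return idx, (point, 1)

def reduce_cluster(values):
    # average all points assigned to one centroid
    total = [0.0] * len(values[0][0])
    count = 0
    for point, n in values:
        total = [t + p for t, p in zip(total, point)]
        count += n
    return [t / count for t in total]

def kmeans_iteration(points, centroids):
    groups = {}
    for point in points:
        key, value = map_point(point, centroids)
        groups.setdefault(key, []).append(value)
    # empty clusters keep their old centroid
    return [reduce_cluster(groups.get(i, [(centroids[i], 1)])) for i in range(len(centroids))]

points = [[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]]
print(kmeans_iteration(points, [[0.0, 0.0], [10.0, 10.0]]))
# -> [[0.0, 0.5], [10.0, 10.5]]
```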
|
About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design, and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). There are also interfaces for C, Matlab/Octave, and Python. Changes: Fixed a bug in save/restore. Fixed a bug in the initial design.
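As a rough illustration of the Bayesian optimization loop (surrogate model plus acquisition function), here is a minimal NumPy-only sketch in one dimension; it is not BayesOpt's C++/Python API, and the kernel, acquisition rule, and parameter values are arbitrary choices made for the example:

```python
# Minimal 1-D Bayesian optimization sketch: GP surrogate with an RBF kernel
# and an upper-confidence-bound acquisition maximized over a grid.
import numpy as np

def rbf(a, b, length=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(Kss - Ks.T @ Kinv @ Ks)
    return mu, np.maximum(var, 0.0)

def objective(x):                       # stand-in for an expensive black box
    return np.sin(3 * x) + 0.5 * x

grid = np.linspace(0.0, 2.0, 200)
X = np.array([0.1, 1.9])                # initial design
y = objective(X)
for _ in range(10):
    mu, var = gp_posterior(X, y, grid)
    ucb = mu + 2.0 * np.sqrt(var)       # acquisition: optimism under uncertainty
    x_next = grid[np.argmax(ucb)]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print("best x:", X[np.argmax(y)], "best value:", y.max())
```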
|
About: Hype is a proof-of-concept deep learning library, where you can perform optimization on compositional machine learning systems of many components, even when such components themselves internally perform optimization. Changes:Initial Announcement on mloss.org.
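A minimal sketch of the nested-optimization idea in plain Python (this is not Hype's F#/DiffSharp-based API; the function names and the finite-difference outer step are purely illustrative):

```python
# Conceptual sketch: an outer optimizer tuning the step size of an inner
# gradient-descent component, i.e. optimizing a system whose component
# itself performs optimization internally.
def inner_optimize(lr, steps=20):
    # inner component: gradient descent on f(x) = (x - 3)^2
    x = 0.0
    for _ in range(steps):
        x -= lr * 2.0 * (x - 3.0)
    return (x - 3.0) ** 2            # inner loss after optimization

def outer_optimize(lr=0.05, outer_steps=50, eps=1e-4, outer_lr=0.01):
    # outer loop: adjust the inner step size via a finite-difference gradient
    for _ in range(outer_steps):
        g = (inner_optimize(lr + eps) - inner_optimize(lr - eps)) / (2 * eps)
        lr -= outer_lr * g
    return lr

print(outer_optimize())   # step size tuned so the inner loop converges well
```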
|
About: Efficient and flexible distributed/mobile deep learning framework for Python, R, Julia, and more. Changes: This version comes with distributed and mobile examples.
|
About: Optunity is a library containing various optimizers for hyperparameter tuning. Hyperparameter tuning is a recurrent problem in many machine learning tasks, both supervised and unsupervised. This package provides several distinct approaches to solve such problems, including helpful facilities such as cross-validation and a plethora of score functions. Changes: This minor release has the same feature set as Optunity 1.1.0, but incorporates several bug fixes, mostly related to the specification of structured search spaces.
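A short example of the kind of call Optunity provides, following the maximize interface shown in its documentation (treat the exact signature and return values as an assumption; the score function here is a toy stand-in for a real cross-validated model score):

```python
# Tune two hyperparameters inside a box-constrained search space.
import optunity

def model_score(gamma, C):
    # toy stand-in for a cross-validated score of a model with (gamma, C)
    return -(gamma - 0.5) ** 2 - (C - 10.0) ** 2

optimal_pars, details, _ = optunity.maximize(model_score, num_evals=100,
                                             gamma=[0, 1], C=[0, 20])
print(optimal_pars)   # roughly {'gamma': 0.5, 'C': 10}
```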
|
About: A platform-independent C++ framework for machine learning, graphical models, and computer vision research and development. Changes:Version 1.9:
|
About: Presage is an intelligent predictive text entry platform. Changes:Initial Announcement on mloss.org.
|
About: FsAlg is a linear algebra library that supports generic types. Changes:Initial Announcement on mloss.org.
|
About: A Theano framework for building and training neural networks. Changes: Initial Announcement on mloss.org.
|
About: The fertilized forests project aims to provide an easy-to-use, easy-to-extend, yet fast library for decision forests. It summarizes the research in this field and provides a solid platform to extend it. Consistent interfaces to C++, Python, and Matlab, together with availability for all major compilers, give the user high flexibility in using the library. Changes: Initial Announcement on mloss.org.
|
About: rabit (Reliable Allreduce and Broadcast Interface) is a lightweight library that provides a fault-tolerant Allreduce and Broadcast interface for portable, scalable, and reliable distributed machine learning programs. Rabit programs can run on various platforms such as Hadoop and MPI, and no installation is needed. Rabit now supports k-means clustering and distributed xgboost, an extremely efficient distributed boosted tree (GBDT) toolkit. Changes: Initial Announcement on mloss.org.
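A conceptual sketch of what the Allreduce primitive buys a distributed algorithm such as k-means, simulated in-process with plain Python/NumPy (this is not the rabit API; allreduce_sum is a hypothetical stand-in):

```python
# Conceptual sketch: every worker contributes partial sums and every worker
# receives the combined result. In rabit this happens across machines and
# tolerates worker restarts; here it is just simulated in-process.
import numpy as np

def allreduce_sum(per_worker_arrays):
    total = np.sum(per_worker_arrays, axis=0)
    return [total.copy() for _ in per_worker_arrays]

# each worker holds a partition of the data and one partial statistics vector
worker_partial_sums = [np.array([1.0, 2.0]),
                       np.array([3.0, 4.0]),
                       np.array([5.0, 6.0])]
reduced = allreduce_sum(worker_partial_sums)
print(reduced[0])   # every worker now sees [ 9. 12.]
```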
|
About: A Scala library for building Bayesian networks with discrete/continuous variables and running deterministic Bayesian inference. Changes: Initial Announcement on mloss.org.
|
About: C++ generic programming tools for machine learning. Changes: Initial Announcement on mloss.org.
|
About: Caffe aims to provide computer vision scientists with a clean, modifiable implementation of state-of-the-art deep learning algorithms. We believe that Caffe is the fastest available GPU CNN implementation. Caffe also provides seamless switching between CPU and GPU, which allows one to train models with fast GPUs and then deploy them on non-GPU clusters. Even in CPU mode, computing predictions on an image takes only 20 ms (in batch mode). Changes:LOTS of stuff: https://github.com/BVLC/caffe/releases/tag/v0.9999
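A hedged sketch of the CPU/GPU switching described above, using module-level calls from the Python interface of later Caffe releases; the exact calls differ between versions, and the file names here are hypothetical:

```python
# Assumption: pycaffe built and on the path; file names are placeholders.
import caffe

caffe.set_mode_gpu()        # train/deploy on a fast GPU ...
# caffe.set_mode_cpu()      # ... or flip one switch to run on CPU-only clusters

# load an already-trained model for prediction (hypothetical file names)
net = caffe.Net('deploy.prototxt', 'model.caffemodel', caffe.TEST)
```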
|
About: ARTOS can be used to quickly learn models for visual object detection without having to collect a set of samples manually. To make this possible, it uses ImageNet, a large image database with more than 20,000 categories. Changes:Initial Announcement on mloss.org.
|
About: PyStruct is a framework for learning structured prediction in Python. It has a modular interface, similar to the well-known SVMstruct. Apart from learning algorithms, it also contains model formulations for popular CRFs and interfaces to many inference algorithm implementations. Changes: Initial Announcement on mloss.org.
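A short example following PyStruct's documented model/learner split (the class names are from its documentation, but the exact arguments and the toy data here are assumptions):

```python
# Chain CRF trained with a structural SVM learner, analogous to SVMstruct.
import numpy as np
from pystruct.models import ChainCRF
from pystruct.learners import OneSlackSSVM

# toy sequence data: 2 sequences of length 3, 4 features per position, 2 labels
X = [np.random.rand(3, 4), np.random.rand(3, 4)]
y = [np.array([0, 1, 1]), np.array([1, 0, 0])]

model = ChainCRF()
learner = OneSlackSSVM(model=model, C=0.1, max_iter=100)
learner.fit(X, y)
print(learner.predict(X))
```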
|
About: Efficient implementation of Semi-Stochastic Gradient Descent algorithm (S2GD) for training logistic regression (L2-regularized). Changes:Initial Announcement on mloss.org.
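A NumPy sketch of the S2GD-style update for L2-regularized logistic regression: each epoch computes one full gradient at a reference point and then takes variance-reduced stochastic steps. This is for illustration only; the released package is an optimized implementation, and details such as how the number of inner steps is drawn differ:

```python
import numpy as np

def logistic_grad(w, x, y, lam):
    # gradient of log(1 + exp(-y * w.x)) + (lam/2)||w||^2 for one example
    s = -y / (1.0 + np.exp(y * np.dot(w, x)))
    return s * x + lam * w

def s2gd(X, y, lam=0.1, stepsize=0.1, epochs=10, inner_steps=None):
    n, d = X.shape
    inner_steps = inner_steps or n
    w = np.zeros(d)
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        # full gradient at the reference point (the expensive, exact part)
        mu = np.mean([logistic_grad(w, X[i], y[i], lam) for i in range(n)], axis=0)
        w_ref = w.copy()
        for _ in range(inner_steps):
            i = rng.integers(n)
            # variance-reduced stochastic step
            g = logistic_grad(w, X[i], y[i], lam) - logistic_grad(w_ref, X[i], y[i], lam) + mu
            w = w - stepsize * g
    return w

X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
print(s2gd(X, y))
```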
|
About: This package implements Ideal PCA in MATLAB. Ideal PCA is a (cross-)kernel based feature extraction algorithm which is (a) a faster alternative to kernel PCA and (b) a method to learn data-manifold-certifying features. Changes: Initial Announcement on mloss.org.
|
About: GPU-accelerated Java deep neural networks. Changes: Initial Announcement on mloss.org.
|