About: Somoclu is a massively parallel implementation of self-organizing maps. It relies on OpenMP for multicore execution, on MPI for distributing the workload, and it can be accelerated by CUDA on a GPU cluster. A sparse kernel is also included, which is useful for training maps on vector spaces generated in text-mining processes. Apart from a command-line interface, Python, Julia, R, and MATLAB interfaces are supported. Changes:
|
About: DataDeps is a package for simplifying the management of data in your Julia application. In particular, it is designed to make it trivial to get static data from a server onto the local machine and to let programs know where that data is. Changes: Initial Announcement on mloss.org.
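To give a feel for the workflow (register a dependency once, then resolve it lazily through the `datadep"..."` string macro), here is a minimal sketch; the "UCI Iris" name and URL are example values chosen for illustration, not something the announcement specifies:

```julia
using DataDeps

# Register a dependency once, e.g. at module load time.
# The name, message, and URL below are example values.
register(DataDep(
    "UCI Iris",
    "The UCI Iris dataset (150 samples, 4 features), fetched for demonstration.",
    "https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data"
))

# The first use triggers the download (after prompting the user);
# afterwards the string macro simply resolves to the local directory.
path = joinpath(datadep"UCI Iris", "iris.data")
println("Data available at: ", path)
```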
|
About: LogReg-Crowds is a collection of Julia implementations of various approaches for learning a logistic regression model from multiple annotators and crowds, namely the works of Raykar et al. (2010), Rodrigues et al. (2013), and Dawid and Skene (1979). Changes: Initial Announcement on mloss.org. Added GitHub page.
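One of the cited approaches, Dawid and Skene (1979), amounts to a short EM loop that alternates between estimating per-item label posteriors and per-annotator confusion matrices. The sketch below is a plain-Julia illustration of that method under simple assumptions (labels coded 1..K, 0 for missing, every item labeled at least once); it is not the package's own API:

```julia
# Dawid & Skene (1979) EM estimator for aggregating noisy crowd labels.
# A[i, j] holds annotator j's label (1..K) for item i, or 0 if missing.
function dawid_skene(A::Matrix{Int}, K::Int; iters = 50)
    n, m = size(A)
    # Initialise per-item label distributions from raw vote counts.
    mu = zeros(n, K)
    for i in 1:n, j in 1:m
        A[i, j] > 0 && (mu[i, A[i, j]] += 1)
    end
    mu ./= sum(mu, dims = 2)
    pi = zeros(m, K, K)          # annotator confusion matrices
    p  = zeros(K)                # class prior
    for _ in 1:iters
        # M-step: class prior and confusion matrices from soft labels.
        p .= vec(sum(mu, dims = 1)) ./ n
        fill!(pi, 1e-9)
        for i in 1:n, j in 1:m
            A[i, j] > 0 && (pi[j, :, A[i, j]] .+= mu[i, :])
        end
        pi ./= sum(pi, dims = 3)
        # E-step: posterior over the true label of each item.
        for i in 1:n
            for k in 1:K
                mu[i, k] = log(p[k])
                for j in 1:m
                    A[i, j] > 0 && (mu[i, k] += log(pi[j, k, A[i, j]]))
                end
            end
            mu[i, :] .= exp.(mu[i, :] .- maximum(mu[i, :]))
            mu[i, :] ./= sum(mu[i, :])
        end
    end
    return mu, pi, p
end
```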
|
About: Efficient and flexible distributed/mobile deep learning framework for Python, R, Julia, and more. Changes: This version comes with distributed and mobile examples.
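For flavor, here is a rough sketch of defining and fitting a small multilayer perceptron through the Julia bindings (MXNet.jl). It is adapted from memory of the package's README-style symbolic API, so the function and keyword names (`@mx.chain`, `mx.FeedForward`, `mx.ArrayDataProvider`, `mx.SGD(lr = ...)`) should be treated as assumptions that may differ between versions:

```julia
using MXNet

# Symbolic definition of a small MLP (names/keywords are assumptions
# based on the older MXNet.jl README and may vary across versions).
mlp = @mx.chain mx.Variable(:data)                    =>
      mx.FullyConnected(name = :fc1, num_hidden = 64) =>
      mx.Activation(name = :relu1, act_type = :relu)  =>
      mx.FullyConnected(name = :fc2, num_hidden = 10) =>
      mx.SoftmaxOutput(name = :softmax)

# Toy data provider: 4 features x 200 samples, 10 classes.
X = rand(Float32, 4, 200)
y = rand(0:9, 200)
train = mx.ArrayDataProvider(:data => X, :softmax_label => y, batch_size = 20)

model = mx.FeedForward(mlp, context = mx.cpu())
mx.fit(model, mx.SGD(lr = 0.1, momentum = 0.9), train, n_epoch = 5)
```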
|
About: SALSA (Software lab for Advanced machine Learning with Stochastic Algorithms) is an implementation of well-known stochastic algorithms for machine learning, developed in the high-level technical computing language Julia. The SALSA software package is designed to address challenges in sparse linear modelling and in linear and non-linear Support Vector Machines applied to large data samples, with an emphasis on being user-centric and user-friendly. Changes: Initial Announcement on mloss.org.
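Since the package centers on stochastic algorithms of this kind, a plain-Julia sketch of one classic example, the Pegasos stochastic sub-gradient method for a linear SVM, may help convey the flavor of the approach. This illustrates the underlying method only; it is not SALSA's own API:

```julia
using LinearAlgebra, Random

# Pegasos stochastic sub-gradient method for a linear SVM
# (hinge loss + L2 regularization), written as a standalone sketch.
function pegasos(X::AbstractMatrix, y::AbstractVector; lambda = 0.1, T = 10_000)
    n, d = size(X)                       # n samples (rows), d features
    w = zeros(d)
    for t in 1:T
        i  = rand(1:n)                   # pick one example at random
        eta = 1 / (lambda * t)           # decaying step size
        margin = y[i] * dot(w, X[i, :])
        w .*= (1 - eta * lambda)         # shrink: gradient of the L2 term
        if margin < 1                    # sub-gradient of the hinge loss
            w .+= eta * y[i] .* X[i, :]
        end
    end
    return w
end

# Toy usage: labels in {-1, +1}
X = randn(500, 5)
y = sign.(X * randn(5) .+ 0.1 .* randn(500))
w = pegasos(X, y)
```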
|
About: xgboost: eXtreme Gradient Boosting. It is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically run parallel computation with OpenMP, and it can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification, and ranking. The package is made to be extensible, so that users can easily define their own objectives. The newest version of xgboost supports distributed learning on platforms such as Hadoop and MPI, and scales to even larger problems. Changes:
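As a sketch of basic usage from Julia, the snippet below follows the older XGBoost.jl wrapper interface (`xgboost(data, nrounds; label = ..., params...)` and `predict`); the exact call signature is an assumption based on that era of the wrapper and has changed in later releases:

```julia
using XGBoost

# Toy binary-classification data: 100 samples, 4 features.
X = rand(100, 4)
y = Float64.(rand(0:1, 100))

# Train 10 rounds of boosted trees. Parameter names (eta, max_depth,
# objective) follow the standard xgboost parameter set; the wrapper
# call signature is assumed from the older XGBoost.jl README.
bst = xgboost(X, 10, label = y, eta = 0.3, max_depth = 3,
              objective = "binary:logistic")

preds = predict(bst, X)   # predicted probabilities on the training data
```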
|