About: This is an implementation of variational Dirichlet process Gaussian mixtures. It works like k-means, but it also searches for the number of clusters. A couple of algorithms are [...] Changes: Initial Announcement on mloss.org.
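As an illustration of the idea (not this package's interface), scikit-learn's BayesianGaussianMixture implements a comparable variational Dirichlet process mixture; a minimal sketch:

    # Illustrative only: scikit-learn's variational DP mixture, not this package's API.
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    X = np.vstack([np.random.randn(100, 2) + [4, 4],
                   np.random.randn(100, 2) - [4, 4]])

    # n_components is only an upper bound; the DP prior prunes unused clusters.
    dpgmm = BayesianGaussianMixture(
        n_components=10,
        weight_concentration_prior_type="dirichlet_process",
        random_state=0,
    ).fit(X)

    labels = dpgmm.predict(X)
    print("effective clusters:", np.unique(labels).size)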
|
About: Matlab code for performing variational inference in the Indian Buffet Process with a linear-Gaussian likelihood model. Changes: Initial Announcement on mloss.org.
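For context, the generative process the package performs inference in is the IBP prior over binary feature matrices; a Python sketch of sampling from that prior (illustrative only, not the package's MATLAB interface):

    # Draw a binary feature matrix Z from the IBP prior with concentration alpha.
    import numpy as np

    def sample_ibp(n_objects, alpha, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        dishes = []                      # per-feature counts of objects using it
        rows = []
        for i in range(n_objects):
            # Existing features: object i takes feature k with prob m_k / (i + 1).
            row = [rng.random() < m / (i + 1) for m in dishes]
            for k, taken in enumerate(row):
                dishes[k] += int(taken)
            # New features: Poisson(alpha / (i + 1)) of them.
            k_new = rng.poisson(alpha / (i + 1))
            row += [True] * k_new
            dishes += [1] * k_new
            rows.append(row)
        Z = np.zeros((n_objects, len(dishes)), dtype=int)
        for i, row in enumerate(rows):
            Z[i, :len(row)] = row
        return Z

    print(sample_ibp(10, alpha=2.0))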
|
About: The VLFeat open source library implements popular computer vision algorithms including affine covariant feature detectors, HOG, SIFT, MSER, k-means, hierarchical k-means, agglomerative information bottleneck, SLIC superpixels, and quick shift. It is written in C for efficiency and compatibility, with interfaces in MATLAB for ease of use, and detailed documentation throughout. It supports Windows, Mac OS X, and Linux. The latest version of VLFeat is 0.9.16. Changes: VLFeat 0.9.16: Added VL_COVDET() (covariant feature detectors). This function implements the following detectors: DoG, Hessian, Harris Laplace, Hessian Laplace, Multiscale Hessian, Multiscale Harris. It also implements affine adaptation, estimation of feature orientation, computation of descriptors on the affine patches (including raw patches), and sourcing of custom feature frames. Added the auxiliary function VL_PLOTSS(). This is the second point update supported by the PASCAL Harvest programme. VLFeat 0.9.15: Added VL_HOG() (HOG features). Added VL_SVMPEGASOS() and a vastly improved SVM implementation. Added IHASHSUM (hashed counting). Improved INTHIST (integral histogram). Added VL_CUMMAX(). Improved the implementations of VL_ROC() and VL_PR(). Added VL_DET() (Detection Error Trade-off (DET) curves). Improved the verbosity control of AIB. Added support for Xcode 4.3 and improved support for past and future Xcode versions. Completed the migration of the old test code in toolbox/test, moving the functionality to the new unit tests in toolbox/xtest. Improved credits. This is the first point update supported by the PASCAL Harvest (several more to come shortly).
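To make the evaluation additions concrete, this is what VL_ROC() computes in spirit: true/false positive rates as the decision threshold sweeps over the classifier scores. A numpy sketch of the standard computation, not VLFeat code:

    import numpy as np

    def roc_points(scores, labels):
        """labels in {+1, -1}; returns (fpr, tpr), one point per threshold."""
        order = np.argsort(-scores)      # descending score
        labels = labels[order]
        tp = np.cumsum(labels > 0)
        fp = np.cumsum(labels < 0)
        tpr = tp / max(tp[-1], 1)
        fpr = fp / max(fp[-1], 1)
        return fpr, tpr

    scores = np.array([0.9, 0.8, 0.4, 0.3])
    labels = np.array([1, -1, 1, -1])
    print(roc_points(scores, labels))

A DET curve, as produced by VL_DET(), plots the same data as miss rate (1 - TPR) against the false positive rate.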
|
About: This is a large-scale online learning implementation with several useful features. See the webpage for more details. Changes: Initial Announcement on mloss.org.
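For readers new to the setting, online learning updates the model one example at a time rather than over the full dataset. A generic sketch of such a loop (illustrative only, not this package's code), using SGD on logistic loss:

    import numpy as np

    def online_logistic(stream, dim, lr=0.1):
        w = np.zeros(dim)
        for x, y in stream:              # y in {-1, +1}
            margin = y * w.dot(x)
            # gradient of log(1 + exp(-y w.x)) with respect to w
            grad = -y * x / (1.0 + np.exp(margin))
            w -= lr * grad
        return w

    rng = np.random.default_rng(0)
    data = [(rng.standard_normal(5), 1 if rng.random() < 0.5 else -1)
            for _ in range(1000)]
    w = online_logistic(data, dim=5)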
|
About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.) Changes:Added support for CUDA GPU-parallelized neural network layers, and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html
|
About: Use the power of crowdsourcing to create ensembles. Changes: Initial Announcement on mloss.org.
|
About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...] Changes: This release includes many bug fixes and improvements. Some of these are detailed at http://jira.pentaho.com/projects/DATAMINING/issues/DATAMINING-771 As usual, for a complete list of changes refer to the changelogs.
|
About: This is a library for solving nu-SVM using Wolfe's minimum norm point algorithm. It solves binary classification problems. Changes: Initial Announcement on mloss.org.
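For orientation, the nu parameterisation is the same one exposed by scikit-learn's NuSVC (which uses a different solver than the Wolfe minimum-norm-point algorithm used here; shown only to illustrate the model):

    import numpy as np
    from sklearn.svm import NuSVC

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
    y = np.array([-1] * 50 + [+1] * 50)

    # nu upper-bounds the fraction of margin errors and
    # lower-bounds the fraction of support vectors.
    clf = NuSVC(nu=0.2, kernel="rbf").fit(X, y)
    print(clf.score(X, y))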
|
About: This is a Perl module that implements a variety of semantic similarity and relatedness measures based on information found in the lexical database WordNet. In particular, it supports the measures of [...] Changes: Initial Announcement on mloss.org.
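The same kinds of WordNet-based measures are available in Python via NLTK, shown here as an analogue (this package itself is a Perl module):

    from nltk.corpus import wordnet as wn

    dog, cat = wn.synset("dog.n.01"), wn.synset("cat.n.01")
    print(dog.path_similarity(cat))   # shortest-path measure
    print(dog.wup_similarity(cat))    # Wu-Palmer measure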
|
About: xgboost: eXtreme Gradient Boosting. It is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically parallelise computation with OpenMP and can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification, and ranking. The package is designed to be extensible, so users can easily define their own objectives. The newest version of xgboost supports distributed learning on platforms such as Hadoop and MPI, and scales to even larger problems. Changes:
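A minimal sketch of that extensibility via xgboost's Python API: training with a user-defined objective (squared error here), which supplies the per-instance gradient and hessian:

    import numpy as np
    import xgboost as xgb

    X = np.random.rand(200, 5)
    y = X @ np.arange(1, 6) + 0.1 * np.random.randn(200)
    dtrain = xgb.DMatrix(X, label=y)

    def squared_error(preds, dtrain):
        labels = dtrain.get_label()
        grad = preds - labels          # d/dpred of 0.5 * (pred - label)^2
        hess = np.ones_like(preds)
        return grad, hess

    bst = xgb.train({"max_depth": 3, "eta": 0.1}, dtrain,
                    num_boost_round=50, obj=squared_error)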
|
About: Stochastic neighbor embedding originally aims at reconstructing given distance relations in a low-dimensional Euclidean space. It can be regarded as a general approach to multi-dimensional scaling, but the reconstruction is based solely on the definition of input (and output) neighborhood probabilities. The present implementation also allows for handling dissimilarity- or score-induced neighborhood topologies and uses quasi-second-order gradient-based (l-)BFGS optimization. Changes:
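The input neighborhood probabilities at the core of SNE follow the standard definition p(j|i) = exp(-||x_i - x_j||^2 / (2 sigma_i^2)) / sum_{k != i} exp(-||x_i - x_k||^2 / (2 sigma_i^2)). A numpy sketch of that definition (not this package's code, and with a single global bandwidth for brevity; SNE proper calibrates sigma_i per point via a perplexity target):

    import numpy as np

    def neighborhood_probs(X, sigma=1.0):
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        P = np.exp(-d2 / (2 * sigma ** 2))
        np.fill_diagonal(P, 0.0)         # a point is not its own neighbor
        return P / P.sum(axis=1, keepdims=True)

    X = np.random.rand(5, 3)
    print(neighborhood_probs(X).round(3))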
|
About: yaplf (Yet Another Python Learning Framework) is an extensible machine learning framework written in Python. Changes: Initial Announcement on mloss.org.
|
About: A Machine Learning framework for Objective-C and Swift (OS X / iOS). Changes: Initial Announcement on mloss.org.
|