About: A Machine Learning framework for Objective-C and Swift (OS X / iOS). Changes: Initial Announcement on mloss.org.
|
About: The Java package jLDADMM is released to provide alternative choices for topic modeling on normal or short texts. It provides implementations of the Latent Dirichlet Allocation topic model and the one-topic-per-document Dirichlet Multinomial Mixture model (i.e. mixture of unigrams), using collapsed Gibbs sampling. In addition, jLDADMM supplies a document clustering evaluation to compare topic models. Changes: Initial Announcement on mloss.org.
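For orientation, the sketch below is a minimal collapsed Gibbs sampler for LDA in Python. It is not jLDADMM's Java API; the corpus format, hyperparameters and iteration count are illustrative assumptions only.

    import numpy as np

    def gibbs_lda(docs, K, V, alpha=0.1, beta=0.01, iters=200, seed=0):
        """Collapsed Gibbs sampling for LDA (conceptual sketch, not jLDADMM's code).
        docs: list of lists of word ids in [0, V); K: number of topics."""
        rng = np.random.default_rng(seed)
        n_dk = np.zeros((len(docs), K))          # doc-topic counts
        n_kw = np.zeros((K, V))                  # topic-word counts
        n_k = np.zeros(K)                        # topic totals
        z = [rng.integers(K, size=len(d)) for d in docs]
        for d, doc in enumerate(docs):           # initialise counts
            for i, w in enumerate(doc):
                k = z[d][i]
                n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
        for _ in range(iters):                   # Gibbs sweeps
            for d, doc in enumerate(docs):
                for i, w in enumerate(doc):
                    k = z[d][i]                  # remove current assignment
                    n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
                    p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
                    k = rng.choice(K, p=p / p.sum())
                    z[d][i] = k                  # add the resampled assignment
                    n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
        return n_dk, n_kw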
|
About: Presage is an intelligent predictive text entry platform. Changes: Initial Announcement on mloss.org.
|
About: libnabo is a fast K Nearest Neighbor library for low-dimensional spaces. Changes:
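As a point of reference for the query libnabo answers (its own interface is C++), a brute-force k-nearest-neighbour search can be sketched in a few lines of Python; libnabo's contribution is answering the same query much faster with kd-trees in low-dimensional spaces.

    import numpy as np

    def knn_brute_force(points, query, k):
        """Indices and distances of the k nearest neighbours of `query`.
        Brute-force O(n) reference, not libnabo's kd-tree implementation."""
        d2 = np.sum((points - query) ** 2, axis=1)   # squared Euclidean distances
        idx = np.argsort(d2)[:k]
        return idx, np.sqrt(d2[idx])

    # usage on hypothetical 3-D data
    pts = np.random.rand(1000, 3)
    idx, dist = knn_brute_force(pts, np.array([0.5, 0.5, 0.5]), k=5)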
|
About: The Universal Java Matrix Package (UJMP) is a data processing tool for Java. Unlike JAMA and Colt, it supports multi-threading and is therefore much faster on current hardware. It not only supports matrices of double values but handles every type of data as a matrix through a common interface, e.g. CSV files, Excel files, images, WAVE audio files, tables in SQL databases, and much more. Changes: Updated to version 0.3.0
|
About: Rival is an open source Java toolkit for recommender system evaluation. It provides a simple way to create evaluation results comparable across different recommendation frameworks. Changes: Initial Announcement on mloss.org.
|
About: libDAI provides free & open source implementations of various (approximate) inference methods for graphical models with discrete variables, including Bayesian networks and Markov Random Fields. Changes: Release 0.3.2 fixes various bugs and adds GLC (Generalized Loop Corrections) written by Siamak Ravanbakhsh.
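To make the task concrete (libDAI's API is C++ and is not shown here), the toy Python sketch below computes an exact marginal in a tiny pairwise model by brute-force enumeration; the potentials are made up. libDAI's purpose is to approximate such marginals efficiently when enumeration is infeasible.

    import itertools
    import numpy as np

    # Pairwise potentials for a toy 3-variable binary chain x0 - x1 - x2.
    psi01 = np.array([[2.0, 1.0], [1.0, 2.0]])
    psi12 = np.array([[1.0, 3.0], [3.0, 1.0]])

    def marginal(var):
        """Exact marginal p(x_var) by enumerating all joint states."""
        p = np.zeros(2)
        for x0, x1, x2 in itertools.product([0, 1], repeat=3):
            p[(x0, x1, x2)[var]] += psi01[x0, x1] * psi12[x1, x2]
        return p / p.sum()

    print(marginal(1))   # marginal distribution of the middle variable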
|
About: Data Sets, Functions and Examples from the Book Changes: Fetched by r-cran-robot on 2018-01-01 00:00:07.925283
|
About: Recur is a collection of GStreamer plugins and language modelling tools based on recurrent neural networks. Changes: Initial Announcement on mloss.org.
|
About: R package implementing statistical tests and post hoc tests to compare multiple algorithms on multiple problems. Changes: Initial Announcement on mloss.org.
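To illustrate the kind of comparison involved (the package itself is in R and its functions are not shown here), the sketch below runs a Friedman test over a hypothetical algorithms-by-problems score matrix using SciPy; a small p-value motivates the post hoc pairwise tests.

    import numpy as np
    from scipy.stats import friedmanchisquare

    # Hypothetical accuracies: rows = problems (datasets), columns = algorithms.
    scores = np.array([
        [0.81, 0.78, 0.85],
        [0.90, 0.88, 0.91],
        [0.72, 0.70, 0.79],
        [0.65, 0.66, 0.71],
        [0.88, 0.85, 0.90],
    ])

    # Friedman test over the algorithms' ranks across problems; if it rejects,
    # post hoc tests (e.g. pairwise comparisons with Holm correction) say which
    # pairs of algorithms actually differ.
    stat, p = friedmanchisquare(*(scores[:, j] for j in range(scores.shape[1])))
    print(f"Friedman chi2 = {stat:.3f}, p = {p:.4f}")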
|
About: A simple and, hopefully, clean and easy-to-follow implementation of Generalized Learning Vector Quantization (GLVQ) with variants for metric adaptation (RGLVQ, GMLVQ, LiRaM). Changes: Initial Announcement on mloss.org.
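A simplified sketch of one stochastic GLVQ step in Python (not the package's Matlab code, and without the metric-adaptation variants): the closest correct-class prototype is pulled towards the sample and the closest wrong-class prototype is pushed away, weighted by their relative distances.

    import numpy as np

    def glvq_update(x, y, protos, labels, lr=0.05):
        """One stochastic GLVQ step (simplified illustrative sketch)."""
        d = np.sum((protos - x) ** 2, axis=1)                 # squared distances
        plus = np.argmin(np.where(labels == y, d, np.inf))    # closest correct prototype
        minus = np.argmin(np.where(labels != y, d, np.inf))   # closest wrong prototype
        dp, dm = d[plus], d[minus]
        denom = (dp + dm) ** 2
        protos[plus] += lr * (dm / denom) * (x - protos[plus])
        protos[minus] -= lr * (dp / denom) * (x - protos[minus])
        return protos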
|
About: A Deep Learning API and server. Changes: Initial Announcement on mloss.org.
|
About: Learning M-Way Tree - Web Scale Clustering - EM-tree, K-tree, k-means, TSVQ, repeated k-means, clustering, random projections, random indexing, hashing, bit signatures. Changes: Initial Announcement on mloss.org.
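As a baseline for what the tree-structured methods scale up, here is plain Lloyd's k-means in Python; this is a reference sketch only, not the toolkit's own EM-tree or K-tree code, and the data are assumed to fit in memory.

    import numpy as np

    def kmeans(X, k, iters=50, seed=0):
        """Plain Lloyd's k-means (reference sketch; EM-tree / K-tree / TSVQ
        scale this primitive to web-sized collections)."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)].astype(float)
        for _ in range(iters):
            # assign each point to its nearest centre
            d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            assign = d.argmin(axis=1)
            # recompute centres, keeping the old centre if a cluster goes empty
            for j in range(k):
                if np.any(assign == j):
                    centers[j] = X[assign == j].mean(axis=0)
        return centers, assign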
|
About: Incremental (Online) Nonparametric Classifier. You can classify either points (standard) or matrices (multivariate time series). Java and Matlab code is already available. Changes: version 2: parameterless system, constant model size, prediction confidence (for active learning). NEW!! C++ version at: https://github.com/ilaria-gori/ABACOC
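The sketch below only illustrates the online predict-then-update protocol with a naive incremental 1-NN classifier in Python; it is not the ABACOC algorithm, which uses an adaptive ball covering and keeps the model size constant.

    import numpy as np

    class OnlineNN:
        """Minimal incremental 1-NN classifier (illustrative stand-in only)."""
        def __init__(self):
            self.X, self.y = [], []

        def predict(self, x):
            if not self.X:
                return None                      # no model yet
            d = np.sum((np.asarray(self.X) - x) ** 2, axis=1)
            return self.y[int(np.argmin(d))]

        def update(self, x, y):
            self.X.append(np.asarray(x, dtype=float))
            self.y.append(y)

    # online protocol: predict on each new point, then reveal its label and update
    model = OnlineNN()
    for x, y in [([0.0, 0.0], "a"), ([1.0, 1.0], "b"), ([0.1, 0.2], "a")]:
        print(model.predict(np.asarray(x)), "->", y)
        model.update(x, y)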
|
About: Jie Gui et al., "How to estimate the regularization parameter for spectral regression discriminant analysis and its kernel version?", IEEE Transactions on Circuits and Systems for Video Technology, vol. 24, no. 2, pp. 211-223, 2014. Changes: Initial Announcement on mloss.org.
|
About: Jie Gui, Zhenan Sun, Guangqi Hou, Tieniu Tan, "An optimal set of code words and correntropy for rotated least squares regression", International Joint Conference on Biometrics, 2014, pp. 1-6. Changes: Initial Announcement on mloss.org.
|
About: Cluster quality evaluation software. Implements cluster quality metrics based on ground truths such as Purity, Entropy, Negentropy, F1 and NMI. It includes a novel approach to correct for pathological or ineffective clusterings, called 'Divergence from a Random Baseline'. Changes: Moved project to GitHub.
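For reference, two of the listed metrics can be computed as below (a Python sketch with made-up labels, not the tool's own code); the baseline-correction idea is not shown.

    import numpy as np
    from sklearn.metrics import normalized_mutual_info_score

    def purity(truth, pred):
        """Fraction of points falling in the majority ground-truth class of their cluster."""
        total = 0
        for c in np.unique(pred):
            _, counts = np.unique(truth[pred == c], return_counts=True)
            total += counts.max()
        return total / len(truth)

    truth = np.array([0, 0, 0, 1, 1, 1, 2, 2])   # hypothetical ground truth
    pred  = np.array([0, 0, 1, 1, 1, 1, 2, 2])   # hypothetical clustering
    print(purity(truth, pred), normalized_mutual_info_score(truth, pred))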
|
About: Learning string edit distance / similarity from data. Changes: Added datasets used in the experiments of the paper.
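For context, the standard weighted edit distance that such learned costs plug into is sketched below in Python; the cost functions here are placeholder unit costs, not parameters learned by the package.

    def edit_distance(s, t, sub_cost=lambda a, b: 0 if a == b else 1,
                      ins_cost=lambda b: 1, del_cost=lambda a: 1):
        """Weighted edit distance by dynamic programming. With unit costs this is
        plain Levenshtein; a learned model would supply the cost functions."""
        n, m = len(s), len(t)
        D = [[0.0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            D[i][0] = D[i - 1][0] + del_cost(s[i - 1])
        for j in range(1, m + 1):
            D[0][j] = D[0][j - 1] + ins_cost(t[j - 1])
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                D[i][j] = min(D[i - 1][j] + del_cost(s[i - 1]),
                              D[i][j - 1] + ins_cost(t[j - 1]),
                              D[i - 1][j - 1] + sub_cost(s[i - 1], t[j - 1]))
        return D[n][m]

    print(edit_distance("kitten", "sitting"))   # 3 with unit costs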
|
About: xgboost: eXtreme Gradient Boosting. It is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically do parallel computation with OpenMP and can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification and ranking. The package is made to be extensible, so that users can also easily define their own objectives. The newest version of xgboost now supports distributed learning on various platforms such as Hadoop and MPI, and scales to even larger problems. Changes:
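A short usage sketch with xgboost's Python binding (the entry describes the R/C++ package, but the core DMatrix/train interface is the same idea); the data and parameter values here are illustrative only.

    import numpy as np
    import xgboost as xgb

    # Hypothetical data: 500 samples, 10 features, binary labels.
    X = np.random.rand(500, 10)
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

    dtrain = xgb.DMatrix(X[:400], label=y[:400])
    dtest = xgb.DMatrix(X[400:], label=y[400:])

    params = {
        "objective": "binary:logistic",   # one of several built-in objectives
        "max_depth": 4,
        "eta": 0.1,
        "nthread": 4,                     # parallel tree construction via OpenMP
    }
    bst = xgb.train(params, dtrain, num_boost_round=50, evals=[(dtest, "test")])
    pred = bst.predict(dtest)             # predicted probabilities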
|