About: Code for Calibrated AdaMEC for binary cost-sensitive classification. The method is simply AdaBoost with properly calibrated probability estimates and a cost-sensitive (i.e. risk-minimizing) decision threshold for classifying new data. Changes: Updated license information.
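A minimal sketch of the decision rule described above, using scikit-learn components rather than the released package; the cost values c_fp and c_fn are illustrative assumptions. AdaBoost's scores are calibrated, and a point is labeled positive whenever the calibrated probability exceeds the risk-minimizing threshold c_fp / (c_fp + c_fn).

    # Sketch only: calibrate AdaBoost's probability estimates, then classify
    # with a cost-sensitive (risk-minimizing) threshold instead of 0.5.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.calibration import CalibratedClassifierCV
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    c_fp, c_fn = 1.0, 5.0                     # assumed misclassification costs
    threshold = c_fp / (c_fp + c_fn)          # risk-minimizing decision threshold

    boost = AdaBoostClassifier(n_estimators=100, random_state=0)
    calibrated = CalibratedClassifierCV(boost, method="sigmoid", cv=5)
    calibrated.fit(X_train, y_train)

    proba = calibrated.predict_proba(X_test)[:, 1]    # calibrated P(y=1 | x)
    y_pred = (proba >= threshold).astype(int)         # cost-sensitive decision

With unequal costs the threshold moves away from 0.5, which is what makes the decision rule cost-sensitive rather than simply error-minimizing.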
|
About: A Python-based library for running experiments with Deep Learning and Ensembles on GPUs. Changes: Initial Announcement on mloss.org.
|
About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.) Changes: Added support for CUDA GPU-parallelized neural network layers and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html
|
About: Use the power of crowdsourcing to create ensembles. Changes: Initial Announcement on mloss.org.
|
About: Survival forests: a Random Forests variant for survival analysis. Original implementation by Leo Breiman. Changes: Initial Announcement on mloss.org.
|
About: Regression forests: Random Forests for regression. Original implementation by Leo Breiman. Changes: Initial Announcement on mloss.org.
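For illustration only, a minimal usage sketch of a regression forest; scikit-learn's RandomForestRegressor stands in for Breiman's original Fortran implementation, and the synthetic data is purely illustrative.

    # Sketch: fit a regression forest and score it on held-out data.
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    X, y = make_regression(n_samples=500, n_features=10, noise=0.5, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    forest = RandomForestRegressor(n_estimators=200, random_state=0)
    forest.fit(X_train, y_train)
    print("held-out R^2:", forest.score(X_test, y_test))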
|
About: The original Random Forests implementation by Breiman and Cutler. Changes: Initial Announcement on mloss.org.
|
About: Itemset boosting (iBoost) performs linear regression in the complete space of power sets of mutations. It implements a forward feature selection procedure where, in each iteration, one mutation [...] Changes: Initial Announcement on mloss.org.
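A hedged sketch of the forward-selection idea, assuming greedy selection by residual sum of squares with an ordinary linear regression; the search is simplified here to single binary mutation features rather than the full itemset (power-set) space, and all data and names are illustrative.

    # Sketch: greedy forward feature selection over a binary mutation matrix,
    # adding at each iteration the feature that most reduces the residual error.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(200, 30)).astype(float)       # toy mutation matrix
    y = 2.0 * X[:, 3] - 1.5 * X[:, 7] + rng.normal(size=200)   # toy phenotype

    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(5):                                  # add five features greedily
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = selected + [j]
            model = LinearRegression().fit(X[:, cols], y)
            rss = np.sum((y - model.predict(X[:, cols])) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
    print("selected mutations:", selected)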
|
About: RapidMiner (formerly YALE) is one of the most widely used open-source data mining suites, owing to its leading-edge technology and broad functional range. Applications of [...] Changes: Initial Announcement on mloss.org.
|
About: Torch is a statistical machine learning library written in C++ at IDIAP. Changes: Initial Announcement on mloss.org.
|