About: ELKI is a framework for implementing data-mining algorithms with support for index structures; it includes a wide variety of clustering and outlier detection methods.

Changes: Additions and improvements from ELKI 0.6.0:

- ELKI is now available on Maven: https://search.maven.org/#artifactdetails|de.lmu.ifi.dbs.elki|elki|0.7.0|jar (please clone https://github.com/elki-project/example-elki-project for a minimal project example).
- Uncertain data types, and clustering algorithms for uncertain data.
- Major refactoring of distances: removal of Distance values, and removed support for non-double-valued distance functions (in particular, DoubleDistance was removed). While this reduces the generality of ELKI, we could remove about 2.5% of the codebase by no longer having to maintain optimized code paths for double distances. Generics for distances were present in almost every distance-based algorithm, and we were also happy to reduce the use of generics this way. Support for non-double-valued distances can trivially be added again, e.g. by adding the specialization one level higher: at the query level instead of the distance level. In this process, we also removed the generics from NumberVector. The object-based get was deprecated for a good reason long ago, and methods such as doubleValue are more efficient (even for non-DoubleVectors).
- Dropped some long-deprecated classes.
- K-means: CLARA clustering, X-means.
- Hierarchical clustering.
- LSDBC clustering.
- EM clustering was refactored and moved into its own package; the new version is much more extensible.
- OPTICS clustering.
- Outlier detection.
- Parallel computation framework, and some parallelized algorithms.
- LibSVM format parser.
- kNN classification (with index acceleration).
- Internal cluster evaluation.
- Statistical dependence measures.
- Distance functions.
- Preprocessing.
- Indexing improvements.
- Frequent itemset mining.
- Uncertain clustering.
- Mathematics.
- The MiniGUI has two "secret" new options, minigui.last and minigui.autorun, to load the last saved configuration and run it, for convenience.
- The logging API has been extended to make logging more convenient in a number of places (saving some lines for progress logging and timing).
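The kNN classification mentioned above follows the standard scheme: look up the k nearest training points and take a majority vote, with the neighbor lookup accelerated by an index structure. A brute-force sketch of the voting scheme, in Python for illustration only (this is not ELKI's Java API):

```python
from collections import Counter
import numpy as np

def knn_classify(train_X, train_y, query, k=3):
    """Majority vote among the k nearest training points.

    Euclidean distance, brute force; an index structure (such as a
    k-d tree or R*-tree, as in ELKI) would replace the full scan.
    """
    d = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(d)[:k]
    return Counter(train_y[i] for i in nearest).most_common(1)[0][0]
```

The index acceleration changes only how the k nearest neighbors are found, not the voting step.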

About: Hubness-aware Machine Learning for High-dimensional Data. Changes:
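Hubness is the tendency, in high-dimensional data, of a few points ("hubs") to appear in the k-nearest-neighbor lists of many other points. A minimal sketch of measuring it via per-point k-occurrence counts (brute force; the function name is illustrative, not this library's API):

```python
import numpy as np

def k_occurrence(points, k):
    """For each point, count how many other points list it among
    their k nearest neighbors (Euclidean, brute force)."""
    n = len(points)
    counts = np.zeros(n, dtype=int)
    for i in range(n):
        d = np.linalg.norm(points - points[i], axis=1)
        d[i] = np.inf  # a point is not its own neighbor
        for j in np.argsort(d)[:k]:
            counts[j] += 1
    return counts
```

A strongly skewed distribution of these counts is the signature of hubness that hubness-aware methods try to compensate for.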

About: DDN learns and visualizes differential dependency networks from condition-specific data. Changes: Initial Announcement on mloss.org.

About: MLDemos is a user-friendly visualization interface for various machine learning algorithms for classification, regression, clustering, projection, dynamical systems, reward maximization and reinforcement learning.

Changes:

New visualization and dataset features:
- Added 3D visualization of samples and of classification, regression and maximization results
- Added a Visualization panel with individual plots, correlations, density, etc.
- Added editing tools to drag/magnet data, change class, and increase or decrease the dimensions of the dataset
- Added categorical dimensions (indexed dimensions with non-numerical values)
- Added a Dataset Editing panel to swap, delete and rename dimensions, classes or categorical values
- Several bugfixes for display, import/export of data, and classification performance

New algorithms and methodologies:
- Added projections to preprocess data (which can then be classified/regressed/clustered), with LDA, PCA, KernelPCA, ICA, CCA
- Added a GridSearch panel for batch-testing ranges of values for up to two parameters at a time
- Added one-vs-all multiclass classification for non-multiclass algorithms
- Trained models can now be kept and tested on new data (training on one dataset, testing on another)
- Added a dataset generator panel for standard toy datasets (e.g. swiss roll, checkerboard, ...)
- Added a number of clustering, regression and classification algorithms (FLAME, DBSCAN, LOWESS, CCA, K-MEANS++, GP Classification, Random Forests)
- Added a Save/Load Model option for GMMs and SVMs
- Added Growing Hierarchical Self-Organizing Maps (original code by Michael Dittenbach)
- Added Automatic Relevance Determination for SVM with RBF kernel (thanks to Ashwini Shukla!)
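One-vs-all reduces multiclass classification to several binary problems: one binary classifier is trained per class to separate it from the rest, and at prediction time the most confident classifier wins. A minimal sketch of that reduction; the binary learner here is a trivial centroid scorer purely for illustration, and all names are hypothetical rather than MLDemos's API:

```python
import numpy as np

class CentroidScorer:
    """Toy binary scorer: score = negative distance to the positive-class centroid."""
    def fit(self, X, y):          # y is 0/1
        self.centroid = X[y == 1].mean(axis=0)
        return self
    def score(self, X):
        return -np.linalg.norm(X - self.centroid, axis=1)

def one_vs_all_fit(X, labels, learner=CentroidScorer):
    """Train one binary scorer per class (that class vs. the rest)."""
    return {c: learner().fit(X, (labels == c).astype(int))
            for c in np.unique(labels)}

def one_vs_all_predict(models, X):
    """Pick, per sample, the class whose scorer is most confident."""
    classes = sorted(models)
    scores = np.stack([models[c].score(X) for c in classes])
    return np.array(classes)[np.argmax(scores, axis=0)]
```

Any binary algorithm that exposes a confidence score can be plugged in as the learner, which is what makes the reduction useful for otherwise non-multiclass methods.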

About: Orange is a component-based machine learning and data mining software. It includes a friendly yet powerful and flexible graphical user interface for visual programming. For more advanced use(r)s, [...] Changes: The core of the system (except the GUI) no longer includes any GPL code and can be licensed under the terms of BSD upon request. The graphical part remains under GPL. Changed the BibTeX reference to the paper recently published in JMLR MLOSS.

About: Divvy is a Mac OS X application for performing dimensionality reduction, clustering, and visualization. Changes: Initial Announcement on mloss.org.

About: MLPlot is a lightweight plotting library written in Java. Changes: Initial Announcement on mloss.org.

About: The Delay Vector Variance (DVV) method uses the predictability of the signal in phase space to characterize a time series. Using the surrogate data methodology, so-called DVV plots and DVV scatter [...] Changes: Initial Announcement on mloss.org.
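Phase-space analysis of a time series starts from a delay embedding, which turns the scalar series into vectors of lagged samples whose neighborhoods can then be examined for predictability. A minimal sketch of such an embedding (parameter names are illustrative; this is not the DVV toolbox's API):

```python
import numpy as np

def delay_embed(x, dim, tau=1):
    """Build delay vectors [x[t], x[t+tau], ..., x[t+(dim-1)*tau]]
    for every t where the full window fits inside the series."""
    x = np.asarray(x)
    n = len(x) - (dim - 1) * tau          # number of complete windows
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)
```

DVV then compares the variance of prediction targets within phase-space neighborhoods of the original series against those of its surrogates.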
