mloss.org new softwarehttp://mloss.orgUpdates and additions to mloss.orgenWed, 14 Mar 2018 21:45:46 -0000seglearn 0.1http://mloss.org/revision/view/2171/<html><p>Seglearn is a Python package for machine learning on time series and sequences using sliding window segmentation. It provides an integrated pipeline covering segmentation, feature extraction, feature processing, and a final estimator. Seglearn offers a flexible approach to multivariate time series and contextual data for classification, regression, and forecasting problems, and is compatible with scikit-learn.
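The sliding-window idea can be sketched in a few lines. This is a conceptual illustration only, not seglearn's actual API: a sequence is cut into fixed-width overlapping segments, and a simple per-segment feature (here, the mean) stands in for the feature extraction step.

```python
def segment(series, width, step):
    """Return overlapping windows of `width`, advancing by `step` each time."""
    return [series[i:i + width] for i in range(0, len(series) - width + 1, step)]

def mean_feature(windows):
    """One illustrative feature per segment: the window mean."""
    return [sum(w) / len(w) for w in windows]

series = [0, 1, 2, 3, 4, 5, 6, 7]
windows = segment(series, width=4, step=2)
# windows -> [[0, 1, 2, 3], [2, 3, 4, 5], [4, 5, 6, 7]]
features = mean_feature(windows)
# features -> [1.5, 3.5, 5.5]
```

In a real pipeline the per-segment features would then feed a standard estimator, which is where the scikit-learn compatibility comes in.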
</p></html>David BurnsWed, 14 Mar 2018 21:45:46 -0000http://mloss.org/software/rss/comments/2171http://mloss.org/revision/view/2171/sequenceshuman activity recognitiontime seriesBaycomp 1.0http://mloss.org/revision/view/2170/<html><p>The functions compare two classifiers on one or multiple data sets. They compute three probabilities: that the first classifier has higher scores than the second, that the differences lie within the region of practical equivalence (rope), and that the second classifier has higher scores.
</p>
<p>The region of practical equivalence is specified by the caller and should correspond to what is "equivalent" in practice; for instance, classification accuracies that differ by less than 0.5 may be called equivalent.
</p>
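The three probabilities can be illustrated with a simple counting sketch. This is not baycomp's actual Bayesian test (which builds a posterior over the mean difference); it merely shows how samples of the score difference split into the three regions defined by the rope:

```python
def rope_probabilities(diffs, rope):
    """Fractions of `diffs` below -rope, within [-rope, rope], and above rope."""
    n = len(diffs)
    left = sum(1 for d in diffs if d < -rope) / n
    within = sum(1 for d in diffs if -rope <= d <= rope) / n
    right = sum(1 for d in diffs if d > rope) / n
    return left, within, right

# Hypothetical accuracy differences (classifier 1 minus classifier 2):
diffs = [-0.020, -0.004, 0.001, 0.003, 0.012, 0.015]
p_left, p_rope, p_right = rope_probabilities(diffs, rope=0.005)
# p_left -> 1/6, p_rope -> 3/6, p_right -> 2/6; the three always sum to 1
```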
<p>Similarly, whether higher scores are better or worse depends upon the type of the score.
</p>
<p>The library can also plot the posterior distributions.
</p></html>Janez Demsar, Alessio Benavoli, Giorgio CoraniWed, 07 Mar 2018 01:37:12 -0000http://mloss.org/software/rss/comments/2170http://mloss.org/revision/view/2170/bayesian testsclassifier comparisonSpectra. A Library for Large Scale Eigenvalue Problems 0.6.1http://mloss.org/revision/view/2169/<html><p>Spectra is a C++ library for large scale eigenvalue problems, built on top of Eigen (<a href="http://eigen.tuxfamily.org">http://eigen.tuxfamily.org</a>).
</p>
<p>Spectra is designed to calculate a specified number (k) of eigenvalues of a large square matrix (A). Usually k is much smaller than the size of the matrix (n), so only a few eigenvalues and eigenvectors are computed, which is in general far more efficient than calculating the whole spectral decomposition. Users can choose eigenvalue selection rules to pick the eigenvalues of interest, such as the k largest eigenvalues or the eigenvalues with the largest real parts.
</p>
<p>Spectra is implemented as a header-only C++ library, whose only dependency, Eigen, is also header-only. Hence Spectra can be easily embedded in C++ projects that require calculating eigenvalues of large matrices.
</p>
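To see why computing only a few eigenvalues can beat a full decomposition, consider power iteration, a toy version of the far more sophisticated iterative methods such libraries use: it approximates just the dominant eigenvalue through repeated matrix-vector products, never forming the full spectrum.

```python
def power_iteration(A, iters=100):
    """Largest-magnitude eigenvalue of a small square matrix A (list of rows)."""
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)          # rescale to avoid overflow
        v = [x / norm for x in w]
    # Rayleigh quotient of the converged vector gives the eigenvalue estimate.
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    return sum(Av[i] * v[i] for i in range(n)) / sum(x * x for x in v)

A = [[2.0, 1.0],
     [1.0, 2.0]]                 # eigenvalues are 3 and 1
print(round(power_iteration(A), 6))   # -> 3.0
```

Each step costs one matrix-vector product, which is why such methods scale to matrices far too large for a dense full decomposition.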
<p>Key Features:
</p>
<ul>
<li>
Calculates a small number of eigenvalues/eigenvectors of a large square matrix.
</li>
<li>
Broad application in dimensionality reduction, principal component analysis, community detection, etc.
</li>
<li>
High performance. In most cases faster than ARPACK.
</li>
<li>
Header-only. Easy to embed into other projects.
</li>
<li>
Supports symmetric/general, dense/sparse matrices.
</li>
<li>
Elegant and user-friendly API with great flexibility.
</li>
<li>
Convenient and powerful R interface via the RSpectra package.
</li>
</ul></html>Yixuan QiuSun, 04 Mar 2018 16:18:03 -0000http://mloss.org/software/rss/comments/2169http://mloss.org/revision/view/2169/singular value decompositionprincipal component analysisfactorizationeigenvalueSomoclu 1.7.5http://mloss.org/revision/view/2168/<html><p>Somoclu is a C++ tool for training self-organizing maps on large data sets using massively parallel resources. It relies on OpenMP for multicore execution and on MPI for distributing the workload across the nodes of a cluster. It can also accelerate training with CUDA when graphics processing units are available. A sparse kernel is included, which is useful for high-dimensional but sparse data, such as the vector spaces common in text mining workflows. Python, Julia, R, and MATLAB interfaces facilitate use in data analysis. The code is released under the GNU GPLv3 licence.
</p>
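The core update any self-organizing-map trainer performs can be sketched in a few lines. This is a minimal serial illustration of one training step on a 1-D map, not Somoclu's parallel kernels: find the best matching unit (BMU) for a sample, then pull nearby units toward the sample with a Gaussian neighborhood weight.

```python
import math

def som_step(weights, sample, lr=0.5, sigma=1.0):
    """One update of a 1-D SOM. `weights` is a list of per-unit weight vectors."""
    # 1. Best matching unit: the unit whose weights are closest to the sample.
    dists = [sum((w - s) ** 2 for w, s in zip(unit, sample)) for unit in weights]
    bmu = dists.index(min(dists))
    # 2. Move every unit toward the sample, scaled by a Gaussian neighborhood.
    for i, unit in enumerate(weights):
        h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
        weights[i] = [w + lr * h * (s - w) for w, s in zip(unit, sample)]
    return bmu

weights = [[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]]
bmu = som_step(weights, sample=[1.2, 1.2])
# bmu -> 1: the middle unit is closest and moves toward the sample
```

Because every unit's distance and update can be computed independently, this loop parallelizes naturally, which is what the OpenMP, MPI, and CUDA kernels exploit.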
<p>Key features:
</p>
<ul>
<li><p>Fast execution by parallelization: OpenMP, MPI, and CUDA are supported.
</p>
</li>
<li><p>Python, Julia, R, and MATLAB interfaces for the dense multicore CPU kernel.
</p>
</li>
<li><p>Planar and toroid maps.
</p>
</li>
<li><p>Rectangular and hexagonal grids.
</p>
</li>
<li><p>Gaussian and bubble neighborhood functions.
</p>
</li>
<li><p>Both dense and sparse input data are supported.
</p>
</li>
<li><p>Large emergent maps of several hundred thousand neurons are feasible.
</p>
</li>
<li><p>Integration with Databionic ESOM Tools.
</p>
</li>
</ul></html>Peter Wittek, Shi Chao GaoThu, 01 Mar 2018 23:30:34 -0000http://mloss.org/software/rss/comments/2168http://mloss.org/revision/view/2168/cudaself organizing mapsmpiesomopenmpr-cran-Boruta 5.2.0http://mloss.org/revision/view/2053/<html><p>Wrapper Algorithm for All Relevant Feature Selection: An all relevant feature selection wrapper algorithm. It finds relevant features by comparing the importance of the original attributes with the importance achievable at random, estimated using permuted copies of the attributes.
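The shadow-attribute idea can be sketched as follows. This is a toy illustration with a correlation-based importance score, not the Boruta package's random-forest importance: a feature counts as relevant only if its importance beats the best importance achieved by its randomly permuted ("shadow") copies.

```python
import random

def importance(feature, target):
    """Toy importance: absolute Pearson-style correlation with the target."""
    n = len(feature)
    mf, mt = sum(feature) / n, sum(target) / n
    cov = sum((f - mf) * (t - mt) for f, t in zip(feature, target))
    vf = sum((f - mf) ** 2 for f in feature) ** 0.5
    vt = sum((t - mt) ** 2 for t in target) ** 0.5
    return abs(cov / (vf * vt)) if vf and vt else 0.0

def is_relevant(feature, target, n_shadows=50, seed=0):
    """Relevant iff the real feature beats every permuted shadow copy."""
    rng = random.Random(seed)
    shadow_best = 0.0
    for _ in range(n_shadows):
        shadow = feature[:]
        rng.shuffle(shadow)          # permutation destroys any real signal
        shadow_best = max(shadow_best, importance(shadow, target))
    return importance(feature, target) > shadow_best

target = [float(i) for i in range(10)]
relevant = is_relevant(list(target), target)   # perfectly informative -> True
noise = is_relevant([3.0] * 10, target)        # constant feature -> False
```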
</p></html>Miron Bartosz Kursa [aut, cre], Witold Remigiusz Rudnicki [aut]Thu, 01 Mar 2018 00:00:04 -0000http://mloss.org/software/rss/comments/2053http://mloss.org/revision/view/2053/r-cranMLweb 1.2http://mloss.org/revision/view/2166/<html><p>MLweb is an open-source project that aims to bring machine learning capabilities to web pages and web applications while keeping all computations on the client side, i.e., in the browser.
</p>
<p>It includes the following.
</p>
<h2>LALOLib: a javascript library to enable and ease scientific computing within web pages</h2>
<p>LALOLib provides functions for
</p>
<ul>
<li>
linear algebra: basic vector and matrix operations, linear system solvers, matrix factorizations (QR, Cholesky), eigendecomposition, singular value decomposition, a conjugate gradient sparse linear system solver, complex numbers/matrices, and the discrete Fourier transform,
</li>
<li>
statistics: sampling from and estimating standard distributions,
</li>
<li>
optimization: steepest descent, BFGS, linear programming (thanks to glpk.js), quadratic programming.
</li>
</ul>
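As a flavor of what the factorization routines above compute, here is a small Cholesky factorization. Python is used purely for exposition (LALOLib itself is JavaScript): a symmetric positive-definite matrix A is factored as L·Lᵀ with L lower-triangular.

```python
import math

def cholesky(A):
    """Lower-triangular L with A = L * L^T, for symmetric positive-definite A."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # diagonal entry
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # below-diagonal entry
    return L

L = cholesky([[4.0, 2.0],
              [2.0, 3.0]])
# L -> [[2.0, 0.0], [1.0, sqrt(2)]], and L * L^T reproduces the input
```

Once L is available, linear systems A x = b reduce to two cheap triangular solves, which is why such factorizations underpin the library's linear system solvers.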
<p>Documentation is available at <a href="http://mlweb.loria.fr/lalolab/lalolib.html">http://mlweb.loria.fr/lalolab/lalolib.html</a>
</p>
<p>See also the benchmark at <a href="http://mlweb.loria.fr/benchmark/">http://mlweb.loria.fr/benchmark/</a>
</p>
<h2>ML.js: a javascript library for machine learning</h2>
<p>In addition to all the functions of LALOLib, ML.js implements the following algorithms.
</p>
<h3>Classification</h3>
<ul>
<li>
K-nearest neighbors,
</li>
<li>
Linear/quadratic discriminant analysis,
</li>
<li>
Naive Bayes classifier,
</li>
<li>
Logistic regression,
</li>
<li>
Perceptron,
</li>
<li>
Multi-layer perceptron,
</li>
<li>
Support vector machines,
</li>
<li>
Multi-class support vector machines,
</li>
<li>
Decision trees
</li>
</ul>
<h3>Regression</h3>
<ul>
<li>
Least squares,
</li>
<li>
Least absolute deviations,
</li>
<li>
K-nearest neighbors,
</li>
<li>
Ridge regression,
</li>
<li>
LASSO,
</li>
<li>
LARS,
</li>
<li>
Orthogonal least squares,
</li>
<li>
Multi-layer perceptron,
</li>
<li>
Kernel ridge regression,
</li>
<li>
Support vector regression,
</li>
<li>
K-LinReg
</li>
</ul>
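As an example of the regression methods listed above, ridge regression has a simple closed form in one dimension. This is an illustrative sketch, not ML.js's implementation: the L2 penalty lambda shrinks the least-squares coefficient toward zero.

```python
def ridge_1d(x, y, lam):
    """Closed-form ridge solution w = (x.y) / (x.x + lambda), no intercept."""
    return sum(xi * yi for xi, yi in zip(x, y)) / (sum(xi * xi for xi in x) + lam)

x = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.0]              # exactly y = 2x
print(ridge_1d(x, y, lam=0.0))   # -> 2.0 (ordinary least squares)
print(ridge_1d(x, y, lam=14.0))  # -> 1.0 (the penalty shrinks the slope)
```

In higher dimensions the same formula becomes w = (XᵀX + λI)⁻¹Xᵀy, and the λI term also makes the system solvable when XᵀX is singular.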
<h3>Clustering</h3>
<ul>
<li>
K-means,
</li>
<li>
Spectral clustering
</li>
</ul>
<h3>Dimensionality reduction</h3>
<ul>
<li>
Principal component analysis,
</li>
<li>
Locally linear embedding,
</li>
<li>
Local tangent space alignment
</li>
</ul>
<p>Documentation is available at <a href="http://mlweb.loria.fr/lalolab/lalolib.html">http://mlweb.loria.fr/lalolab/lalolib.html</a>
</p>
<h2>LALOLab: a matlab-like development environment</h2>
<p>Try it at <a href="http://mlweb.loria.fr/lalolab/">http://mlweb.loria.fr/lalolab/</a>
</p></html>fabien lauer, pedro ernesto garcia rodriguezFri, 23 Feb 2018 15:40:27 -0000http://mloss.org/software/rss/comments/2166http://mloss.org/revision/view/2166/classificationclusteringregressiondimensionality reductionlinear algebradevelopment environmentscientific computingwebBarista 0.2http://mloss.org/revision/view/2165/<html><p>In recent years, the importance of deep learning has increased significantly in pattern recognition, computer vision, and artificial intelligence research, as well as in industry. However, despite the existence of multiple deep learning frameworks, there is a lack of comprehensible and easy-to-use high-level tools for the design, training, and testing of deep neural networks (DNNs). Barista is an open-source graphical high-level interface for the Caffe deep learning framework. While Caffe is one of the most popular frameworks for training DNNs, editing prototxt files to specify the net architecture and hyperparameters can become a cumbersome and error-prone task. Instead, Barista offers a fully graphical user interface with a graph-based net topology editor and provides an end-to-end training facility for DNNs, which allows researchers to focus on solving their problems without having to write code, edit text files, or manually parse logged data.
</p></html>Soeren Klemm, Aaron Scherzinger, Dominik Drees, Xiaoyi JiangWed, 21 Feb 2018 15:51:11 -0000http://mloss.org/software/rss/comments/2165http://mloss.org/revision/view/2165/neural networksmachine learningArmadillo library 8.400http://mloss.org/revision/view/2164/<html><p>Armadillo is a high-quality linear algebra library (matrix maths) for the C++ language, aiming for a good balance between speed and ease of use.
</p>
<p>Useful for algorithm development directly in C++, or for quick conversion of research code into production environments (e.g. software and hardware products).
</p>
<p>Provides high-level syntax (API) deliberately similar to MATLAB.
</p>
<p>Provides efficient classes for vectors, matrices and cubes, as well as 200+ associated functions; integer, floating point and complex numbers are supported.
</p>
<p>Various matrix decompositions are provided through integration with LAPACK, or one of its high-performance drop-in replacements (e.g. multi-threaded Intel MKL or OpenBLAS).
</p>
<p>A sophisticated expression evaluator (based on template meta-programming) automatically combines several operations to increase speed and efficiency.
</p>
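The effect of such expression fusion can be illustrated in a few lines. Python is used here purely for exposition; Armadillo achieves this at compile time with C++ template meta-programming: an elementwise expression like a + b + c is evaluated in a single pass instead of materializing a temporary for every intermediate result.

```python
def add_naive(a, b, c):
    """Eager evaluation: (a + b) builds a full temporary list before adding c."""
    tmp = [x + y for x, y in zip(a, b)]            # temporary allocation
    return [t + z for t, z in zip(tmp, c)]

def add_fused(a, b, c):
    """Fused evaluation: one pass over the data, no intermediate list."""
    return [x + y + z for x, y, z in zip(a, b, c)]

a, b, c = [1, 2], [10, 20], [100, 200]
# Both give [111, 222]; the fused form does half the memory traffic.
```

Avoiding the temporaries matters most for large matrices, where the extra allocations and memory passes dominate the cost of the arithmetic itself.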
<p>Primarily developed by Conrad Sanderson, with contributions from around the world.
</p></html>Conrad Sanderson, Ryan Curtin, many contributorsTue, 20 Feb 2018 03:26:16 -0000http://mloss.org/software/rss/comments/2164http://mloss.org/revision/view/2164/matlabmatrix libraryatlaslapacklinear algebratemplatespython weka wrapper3 0.1.4http://mloss.org/revision/view/2163/<html><p>A thin Python3 wrapper that uses the javabridge Python library to communicate with a Java Virtual Machine executing Weka API calls. Offers all major APIs, like data generators, loaders, savers, filters, classifiers, clusterers, attribute selection, associations and experiments. Weka packages can be listed/installed/uninstalled as well. It does not provide any graphical frontend, but some basic plotting and graph visualizations are available through matplotlib and pygraphviz.
</p></html>peter reutemannSun, 18 Feb 2018 04:54:03 -0000http://mloss.org/software/rss/comments/2163http://mloss.org/revision/view/2163/machine learningwekapython weka wrapper 0.3.12http://mloss.org/revision/view/2162/<html><p>A thin Python wrapper that uses the javabridge Python library to communicate with a Java Virtual Machine executing Weka API calls. Offers all major APIs, like data generators, loaders, savers, filters, classifiers, clusterers, attribute selection, associations and experiments. Weka packages can be listed/installed/uninstalled as well. It does not provide any graphical frontend, but some basic plotting and graph visualizations are available through matplotlib and pygraphviz.
A simple workflow engine was added with release 0.3.0.
</p></html>peter reutemannSun, 18 Feb 2018 04:29:24 -0000http://mloss.org/software/rss/comments/2162http://mloss.org/revision/view/2162/machine learningweka