Projects supporting the libsvm data format.
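
For reference, the libsvm (svmlight) data format stores one example per line as a label followed by sparse index:value pairs, with feature indices starting at 1. A minimal sketch of reading such a file in Python, assuming scikit-learn is available for parsing (the file name is illustrative):

    # Each line looks like:  <label> <index>:<value> <index>:<value> ...
    # e.g. "1 3:0.5 12:1.7" is label 1 with feature 3 = 0.5 and feature 12 = 1.7.
    from sklearn.datasets import load_svmlight_file

    # load_svmlight_file returns a scipy.sparse CSR matrix plus a label vector
    X, y = load_svmlight_file("train.libsvm")   # illustrative file name
    print(X.shape, y.shape)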

KeLP 2.0.0

by kelpadmin - November 26, 2015, 16:14:53 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3947 views, 987 downloads, 3 subscriptions

About: Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as an agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions and learning algorithms through the definition of specific interfaces. Once a new kernel function has been implemented, it can be automatically adopted in all the available kernel-machine algorithms. KeLP includes different online and batch learning algorithms for classification, regression and clustering, as well as several kernel functions, ranging from vector-based to structural kernels. It allows building complex kernel-machine-based systems, leveraging JSON/XML interfaces to instantiate classifiers without writing a single line of code.


This is a major release that includes brand new features as well as a renewed architecture of the entire project.

KeLP is now organized in four Maven projects:

  • kelp-core: it contains the infrastructure of abstract classes and interfaces to work with KeLP. Furthermore, some implementations of algorithms, kernels and representations are included, to provide a basic working environment.

  • kelp-additional-kernels: it contains several kernel functions that extend the set of kernels made available in the kelp-core project. Moreover, this project implements the specific representations required to enable the application of such kernels. In this project the following kernel functions are considered: Sequence kernels, Tree kernels and Graph kernels.

  • kelp-additional-algorithms: it contains several learning algorithms extending the set of algorithms provided in the kelp-core project, e.g. the C-Support Vector Machine or ν-Support Vector Machine learning algorithms. In particular, advanced learning algorithms for classification and regression can be found in this package. The algorithms are grouped in: 1) Batch Learning, where the complete training dataset is supposed to be entirely available during the learning phase; 2) Online Learning, where individual examples are exploited one at a time to incrementally acquire the model.

  • kelp-full: this is the complete package of KeLP. It aggregates the previous modules in one jar. It also contains a set of fully functioning examples showing how to implement a learning system with KeLP. The usage of both batch and online learning algorithms is shown here. Different examples cover the usage of standard kernels, Tree Kernels and Sequence Kernels, with caching mechanisms.

Furthermore this new release includes:

  • CsvDatasetReader: it allows reading files in CSV format

  • DCDLearningAlgorithm: it is an implementation of the Dual Coordinate Descent learning algorithm

  • methods for checking the consistency of a dataset.

Check out this new version from our repositories. API Javadoc is already available. Your suggestions will be very valuable to us, so download and try KeLP 2.0.0!

ADAMS 0.4.11

by fracpete - November 18, 2015, 10:58:55 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 15373 views, 3081 downloads, 3 subscriptions

About: The Advanced Data mining And Machine learning System (ADAMS) is a novel, flexible workflow engine aimed at quickly building and maintaining real-world, complex knowledge workflows.


Some highlights of this release:

  • switch to Java 8
  • preferred IDE is now IntelliJ IDEA
  • removed OSX builds
  • 43 new actors
  • 13 new conversions
  • removed obsolete actors and conversions
  • added video support (video files and webcams)
  • added object detection and tracking (incl. recording of object trails)
  • proof-of-concept remote execution of jobs
  • SSH console
  • support for web scraping using JSoup
  • MEKA upgraded to 1.9.0
  • MOA regressor support added
  • better syntax highlighting for Groovy/Jython
  • several new Weka classifiers (e.g. Veto, LeanMultiScheme, ThresholdedBinaryClassification, InputSmearing)
  • new genetic algorithm: Hermione
  • extended the abstaining classifier framework (integrates with Weka)
  • adams-imaging split into: adams-imaging, adams-boofcv, adams-imagemagick, adams-imagej, adams-openimaj (newly added)

MLweb 0.1.2

by lauerfab - October 9, 2015, 11:55:52 CET [ Project Homepage BibTeX Download ] 1462 views, 401 downloads, 3 subscriptions

About: MLweb is an open source project that aims at bringing machine learning capabilities into web pages and web applications, while keeping all computations on the client side. It includes (i) a JavaScript library to enable scientific computing within web pages, (ii) a JavaScript library implementing machine learning algorithms for classification, regression, clustering and dimensionality reduction, (iii) a web application providing a MATLAB-like development environment.

  • Add Regression:AutoReg method
  • Add KernelRidgeRegression tuning function
  • More efficient predictions for KRR, SVM, SVR
  • Add BFGS optimization method
  • Faster QR, SVD and eigendecomposition
  • Better support for sparse vectors and matrices
  • Add linear algebra benchmark
  • Fix plots in LALOlib/ML.js
  • Fix cross-origin issues in new MLlab()
  • Small bug fixes

Somoclu 1.5

by peterwittek - September 30, 2015, 13:27:52 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10107 views, 1989 downloads, 3 subscriptions

About: Somoclu is a massively parallel implementation of self-organizing maps. It relies on OpenMP for multicore execution and MPI for distributing the workload, and it can be accelerated by CUDA on a GPU cluster. A sparse kernel is also included, which is useful for training maps on vector spaces generated in text mining processes. Apart from a command line interface, Python, R, and MATLAB interfaces are supported (a usage sketch follows the list below).

  • New: Python interface has visual capabilities.
  • New: Option for hexagonal grid.
  • New: Option for requesting compact support in updating the map.
  • New: Python, R, and MATLAB interfaces now allow passing an initial codebook.
  • Changed: Reduced memory use in calculating U-matrices.
  • Changed: Build system rebuilt and simplified.
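
A minimal sketch of how the new options above might be used from the Python interface; it assumes the somoclu Python package is installed, and the exact parameter names (gridtype, compactsupport, initialcodebook) should be checked against the package documentation for your version:

    # Sketch only: toy data and an illustrative map size; parameter names follow
    # the release notes (hexagonal grid, compact support, visual capabilities)
    # but may differ slightly between somoclu versions.
    import numpy as np
    import somoclu

    data = np.random.rand(200, 10).astype(np.float32)   # toy training data

    som = somoclu.Somoclu(30, 20,                # map of 30 columns x 20 rows
                          gridtype="hexagonal",  # new: hexagonal grid option
                          compactsupport=True)   # new: compact support in updates
    som.train(data)                              # an initial codebook could also be
                                                 # passed via initialcodebook=...
    som.view_umatrix(bestmatches=True)           # new: visualization from Python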

python weka wrapper 0.3.3

by fracpete - September 26, 2015, 06:11:42 CET [ Project Homepage BibTeX Download ] 19270 views, 4117 downloads, 3 subscriptions

About: A thin Python wrapper that uses the javabridge library to communicate with a Java Virtual Machine executing Weka API calls (see the sketch below).

  • updated to Weka 3.7.13
  • documentation now covers the API as well
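
A minimal usage sketch, assuming the package and its bundled Weka jars are installed and an ARFF file is available on disk (the file name is illustrative); the wrapper starts a JVM through javabridge before any Weka call is made:

    # Sketch: cross-validate a Weka classifier from Python.
    # The javabridge-backed JVM must be running before any Weka API call.
    import weka.core.jvm as jvm
    from weka.core.converters import Loader
    from weka.core.classes import Random
    from weka.classifiers import Classifier, Evaluation

    jvm.start()
    try:
        loader = Loader(classname="weka.core.converters.ArffLoader")
        data = loader.load_file("iris.arff")   # illustrative file name
        data.class_is_last()                   # last attribute is the class

        cls = Classifier(classname="weka.classifiers.trees.J48")
        evl = Evaluation(data)
        evl.crossvalidate_model(cls, data, 10, Random(1))
        print(evl.summary())
    finally:
        jvm.stop()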

XGBoost v0.4.0

by crowwork - May 12, 2015, 08:57:16 CET [ Project Homepage BibTeX Download ] 9966 views, 1948 downloads, 3 subscriptions

About: xgboost (eXtreme Gradient Boosting) is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically run parallel computation with OpenMP, and it can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification and ranking. The package is designed to be extensible, so that users can easily define their own objectives. The newest version of xgboost supports distributed learning on platforms such as Hadoop and MPI, and scales to even larger problems (a short usage sketch follows the feature list below).

  • Distributed version of xgboost that runs on YARN, scales to billions of examples

  • Directly save/load data and models from/to S3 and HDFS

  • Feature importance visualization in R module, by Michael Benesty

  • Predict leaf index

  • Poisson regression for count data

  • Early stopping option in training

  • Native save/load support in R and Python

  • xgboost models can now be saved using save/load in R

  • The xgboost Python model is now picklable

  • sklearn wrapper is supported in the Python module

  • Experimental external memory version
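
A minimal Python sketch touching a few of the items above (libsvm-format input, early stopping, native save/load); file names and parameter values are illustrative, and the exact DMatrix file-loading syntax may differ between xgboost versions:

    # Sketch: train a booster with early stopping, then save and reload it.
    # DMatrix can read the sparse libsvm text format directly from a path.
    import xgboost as xgb

    dtrain = xgb.DMatrix("train.libsvm")   # illustrative file names
    dvalid = xgb.DMatrix("valid.libsvm")

    params = {"objective": "binary:logistic", "max_depth": 4, "eta": 0.1}
    bst = xgb.train(params, dtrain,
                    num_boost_round=200,
                    evals=[(dvalid, "valid")],
                    early_stopping_rounds=10)   # early stopping option

    bst.save_model("model.bin")                 # native save/load support
    loaded = xgb.Booster(model_file="model.bin")
    preds = loaded.predict(xgb.DMatrix("valid.libsvm"))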

JMLR JKernelMachines 2.5

by dpicard - December 11, 2014, 17:51:42 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 22240 views, 5108 downloads, 4 subscriptions

Rating: 4.5/5 (based on 4 votes)

About: A machine learning library in Java for easy development of new kernels.


Version 2.5

  • New active learning algorithms
  • Better threading management
  • New multiclass SVM algorithm based on SDCA
  • Handle class balancing in cross-validation
  • Optional EJML support switched to version 0.26
  • Various bugfixes and improvements

WolfeSVM 0.0

by utmath - November 19, 2014, 10:46:11 CET [ Project Homepage BibTeX Download ] 1130 views, 308 downloads, 2 subscriptions

About: This is a library for solving nu-SVM using Wolfe's minimum norm point algorithm. It can be used to solve binary classification problems.


Initial Announcement.

LIBOL 0.3.0

by stevenhoi - December 12, 2013, 15:26:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12809 views, 4576 downloads, 2 subscriptions

About: LIBOL is an open-source library with a family of state-of-the-art online learning algorithms for machine learning and big data analytics research. The current version supports 16 online algorithms for binary classification and 13 online algorithms for multiclass classification.


Compared with the previous version (V0.2.3), the new version (V0.3.0) makes the following important changes:

• Add a template and guide for adding new algorithms;

• Improve parameter settings and make documentation clear;

• Improve documentation on data formats and key functions;

• Amend the "OGD" function to use different loss types;

• Fix some name inconsistencies and other minor bugs.

OpenANN 1.1.0

by afabisch - September 26, 2013, 23:52:03 CET [ Project Homepage BibTeX Download ] 4294 views, 905 downloads, 2 subscriptions

About: A library for artificial neural networks.


Added algorithms:

  • L-BFGS optimizer
  • k-means
  • sparse auto-encoder
  • preprocessing: normalization, PCA, ZCA whitening

Nen Beta

by pascal - February 19, 2012, 00:31:34 CET [ Project Homepage BibTeX Download ] 4457 views, 1253 downloads, 1 subscription

About: A 3-layer neural network for regression with a sigmoid activation function and a command-line interface similar to LibSVM's.


Initial Announcement.