Projects that are tagged with regression.
Showing items 1-20 of 43 (page 1 of 3).

KeLP 2.0.0

by kelpadmin - November 26, 2015, 16:14:53 CET. 3995 views, 993 downloads, 3 subscriptions

About: Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as an agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions and learning algorithms through the definition of specific interfaces. Once a new kernel function has been implemented, it can be automatically adopted in all the available kernel-machine algorithms. KeLP includes different Online and Batch Learning algorithms for Classification, Regression and Clustering, as well as several Kernel functions, ranging from vector-based to structural kernels. It allows building complex kernel-machine-based systems, leveraging JSON/XML interfaces to instantiate classifiers without writing a single line of code.
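
KeLP itself is a Java framework; the following is only a minimal Python sketch of the decoupling idea described above (a kernel implemented once and plugged, through a shared interface, into any kernel machine), with hypothetical names and no relation to KeLP's actual API.

    import numpy as np

    # Hypothetical sketch: any kernel k(x, z) can be plugged into any kernel
    # machine that talks to the kernel only through this simple interface.
    def linear_kernel(x, z):
        return float(np.dot(x, z))

    def rbf_kernel(x, z, gamma=0.5):
        return float(np.exp(-gamma * np.sum((x - z) ** 2)))

    class KernelPerceptron:
        """A kernel machine written against the kernel interface only."""
        def __init__(self, kernel):
            self.kernel = kernel            # any k(x, z) works here
            self.alphas, self.support = [], []

        def decision(self, x):
            return sum(a * self.kernel(s, x) for a, s in zip(self.alphas, self.support))

        def fit(self, X, y, epochs=5):
            for _ in range(epochs):
                for xi, yi in zip(X, y):
                    if yi * self.decision(xi) <= 0:   # mistake-driven update
                        self.alphas.append(yi)
                        self.support.append(xi)
            return self

    # Swapping the kernel requires no change to the learning algorithm.
    X = np.array([[-2.0, -1.0], [-1.0, -2.0], [1.0, 2.0], [2.0, 1.0]])
    y = np.array([-1, -1, 1, 1])
    for k in (linear_kernel, rbf_kernel):
        model = KernelPerceptron(k).fit(X, y)
        print(k.__name__, [int(np.sign(model.decision(x))) for x in X])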


This is a major release that includes brand new features as well as a renewed architecture of the entire project.

Now KeLP is organized into four Maven projects:

  • kelp-core: it contains the infrastructure of abstract classes and interfaces to work with KeLP. Furthermore, some implementations of algorithms, kernels and representations are included, to provide a base operative environment.

  • kelp-additional-kernels: it contains several kernel functions that extend the set of kernels made available in the kelp-core project. Moreover, this project implements the specific representations required to enable the application of such kernels. The following kernel functions are included: Sequence kernels, Tree kernels and Graph kernels.

  • kelp-additional-algorithms: it contains several learning algorithms extending the set of algorithms provided in the kelp-core project, e.g. the C-Support Vector Machine or ν-Support Vector Machine learning algorithms. In particular, advanced learning algorithms for classification and regression can be found in this package. The algorithms are grouped into: 1) Batch Learning, where the complete training dataset is assumed to be entirely available during the learning phase; 2) Online Learning, where individual examples are exploited one at a time to incrementally acquire the model.

  • kelp-full: this is the complete package of KeLP. It aggregates the previous modules in one jar. It also contains a set of fully functioning examples showing how to implement a learning system with KeLP. The usage of both Batch and Online Learning algorithms is shown here. Different examples cover the usage of standard kernels, Tree Kernels and Sequence Kernels, with caching mechanisms.

Furthermore, this new release includes:

  • CsvDatasetReader: it allows reading files in CSV format

  • DCDLearningAlgorithm: the implementation of the Dual Coordinate Descent learning algorithm (a generic sketch follows this list)

  • methods for checking the consistency of a dataset.
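
DCDLearningAlgorithm refers to Dual Coordinate Descent for linear SVMs. Purely for illustration, here is a generic NumPy sketch of the dual coordinate descent update for an L1-loss linear SVM; it is not KeLP's Java implementation.

    import numpy as np

    def dcd_linear_svm(X, y, C=1.0, epochs=20):
        """Dual coordinate descent for an L1-loss linear SVM (generic sketch).

        Minimizes 0.5*a'Qa - sum(a) subject to 0 <= a_i <= C, with
        Q_ij = y_i*y_j*x_i.x_j, while keeping w = sum_i a_i*y_i*x_i up to date.
        """
        n, d = X.shape
        alpha, w = np.zeros(n), np.zeros(d)
        Qii = np.einsum("ij,ij->i", X, X)            # diagonal of Q
        for _ in range(epochs):
            for i in np.random.permutation(n):
                if Qii[i] == 0.0:
                    continue
                grad = y[i] * w.dot(X[i]) - 1.0      # partial derivative w.r.t. alpha_i
                new_ai = min(max(alpha[i] - grad / Qii[i], 0.0), C)
                w += (new_ai - alpha[i]) * y[i] * X[i]
                alpha[i] = new_ai
        return w

    # Toy usage: two well-separated blobs with labels in {-1, +1}.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
    y = np.hstack([-np.ones(50), np.ones(50)])
    w = dcd_linear_svm(X, y)
    print("training accuracy:", np.mean(np.sign(X.dot(w)) == y))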

Check out this new version from our repositories. API Javadoc is already available. Your suggestions will be very valuable to us, so download and try KeLP 2.0.0!

JMLR dlib ml 18.18

by davis685 - October 29, 2015, 01:48:44 CET. 120255 views, 20010 downloads, 4 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems.


This release has focused on build system improvements, both for the Python API and C++ builds using CMake. This includes adding a script for installing the dlib Python API as well as a make install target for installing a C++ shared library for non-Python use.

MLweb 0.1.2

by lauerfab - October 9, 2015, 11:55:52 CET. 1464 views, 402 downloads, 3 subscriptions

About: MLweb is an open source project that aims at bringing machine learning capabilities into web pages and web applications, while keeping all computations on the client side. It includes (i) a JavaScript library to enable scientific computing within web pages, (ii) a JavaScript library implementing machine learning algorithms for classification, regression, clustering and dimensionality reduction, and (iii) a web application providing a MATLAB-like development environment.

  • Add Regression:AutoReg method
  • Add KernelRidgeRegression tuning function (a kernel ridge regression sketch follows this list)
  • More efficient predictions for KRR, SVM, SVR
  • Add BFGS optimization method
  • Faster QR, SVD and eigendecomposition
  • Better support for sparse vectors and matrices
  • Add linear algebra benchmark
  • Fix plots in LALOlib/ML.js
  • Fix cross-origin issues in new MLlab()
  • Small bug fixes
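
MLweb's KernelRidgeRegression is implemented in JavaScript; as a language-neutral reference for what kernel ridge regression computes (the dual coefficients alpha = (K + lambda*I)^-1 y), here is a small NumPy sketch with an assumed RBF kernel.

    import numpy as np

    def rbf_gram(A, B, gamma=1.0):
        """RBF kernel matrix between the rows of A and the rows of B."""
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * d2)

    def krr_fit(X, y, lam=0.1, gamma=1.0):
        # Dual solution: alpha = (K + lam*I)^-1 y
        K = rbf_gram(X, X, gamma)
        return np.linalg.solve(K + lam * np.eye(len(X)), y)

    def krr_predict(Xtrain, alpha, Xtest, gamma=1.0):
        return rbf_gram(Xtest, Xtrain, gamma) @ alpha

    # Usage: fit a noisy sine and predict on a small grid.
    rng = np.random.default_rng(1)
    X = rng.uniform(0, 6, (60, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
    alpha = krr_fit(X, y, lam=0.1, gamma=2.0)
    Xg = np.linspace(0, 6, 5)[:, None]
    print(np.round(krr_predict(X, alpha, Xg, gamma=2.0), 2))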

SALSA.jl 0.0.5

by jumutc - September 28, 2015, 17:28:56 CET. 536 views, 85 downloads, 1 subscription

About: SALSA (Software lab for Advanced machine Learning with Stochastic Algorithms) is an implementation of well-known stochastic algorithms for Machine Learning, developed in the high-level technical computing language Julia. The SALSA software package is designed to address challenges in sparse linear modelling and in linear and non-linear Support Vector Machines applied to large data samples, with an emphasis on being user-centric and user-friendly.
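
SALSA itself is written in Julia; the snippet below is only a generic Python illustration of the kind of stochastic algorithm it targets (a Pegasos-style stochastic sub-gradient method for a linear SVM) and does not use SALSA's API.

    import numpy as np

    def sgd_linear_svm(X, y, lam=0.01, iters=5000, seed=0):
        """Pegasos-style stochastic sub-gradient descent for a linear SVM (sketch)."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for t in range(1, iters + 1):
            i = rng.integers(n)
            eta = 1.0 / (lam * t)              # decreasing step size
            margin = y[i] * w.dot(X[i])
            w *= (1.0 - eta * lam)             # shrinkage from the L2 regularizer
            if margin < 1.0:                   # hinge-loss sub-gradient step
                w += eta * y[i] * X[i]
        return w

    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(-1.5, 1, (100, 3)), rng.normal(1.5, 1, (100, 3))])
    y = np.hstack([-np.ones(100), np.ones(100)])
    w = sgd_linear_svm(X, y)
    print("training accuracy:", np.mean(np.sign(X.dot(w)) == y))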


Initial Announcement on mloss.org.

WEKA 3.7.13

by mhall - September 11, 2015, 04:55:02 CET. 51427 views, 7627 downloads, 4 subscriptions

Rating: 4/5 stars (based on 6 votes)

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...]


In core weka:

  • Numerically stable implementation of variance calculation in core Weka classes - thanks to Benjamin Weber
  • Unified expression parsing framework (with compiled expressions) is now employed by filters and tools that use mathematical/logical expressions - thanks to Benjamin Weber
  • Developers can now specify GUI and command-line options for their Weka schemes via a new unified annotation-based mechanism
  • ClassConditionalProbabilities filter - replaces the value of a nominal attribute in a given instance with its probability given each of the possible class values
  • GUI package manager's available list now shows both packages that are not currently installed, and those installed packages for which there is a more recent version available that is compatible with the base version of Weka being used
  • ReplaceWithMissingValue filter - allows values to be randomly (with a user-specified probability) replaced with missing values. Useful for experimenting with methods for imputing missing values (a generic sketch follows this list)
  • WrapperSubsetEval can now use plugin evaluation metrics
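
The ReplaceWithMissingValue filter is a Weka class; purely as a sketch of the operation it performs (randomly blanking cells with a user-specified probability, e.g. to test imputation strategies), here is a NumPy analogue with hypothetical names.

    import numpy as np

    def replace_with_missing(X, prob=0.1, seed=42):
        """Return a copy of X in which each cell is set to NaN with probability prob.

        Generic sketch of the idea behind Weka's ReplaceWithMissingValue filter;
        useful for generating controlled missingness for imputation experiments.
        """
        rng = np.random.default_rng(seed)
        X = np.asarray(X, dtype=float).copy()
        X[rng.random(X.shape) < prob] = np.nan
        return X

    X = np.arange(20, dtype=float).reshape(5, 4)
    Xm = replace_with_missing(X, prob=0.25)
    print(Xm)
    print("fraction missing:", np.isnan(Xm).mean())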

In packages:

  • alternatingModelTrees package - alternating trees for regression
  • timeSeriesFilters package, contributed by Benjamin Weber
  • distributedWekaSpark package - wrapper for distributed Weka on Spark
  • wekaPython package - execution of CPython scripts and wrapper classifier/clusterer for Scikit Learn schemes
  • MLRClassifier in RPlugin now provides access to almost all classification and regression learners in MLR 2.4

YCML 0.2.2

by yconst - August 24, 2015, 20:28:45 CET. 806 views, 152 downloads, 3 subscriptions

About: A Machine Learning framework for Objective-C and Swift (OS X / iOS)


Initial Announcement on mloss.org.

JMLR GPstuff 4.6

by avehtari - July 15, 2015, 15:08:06 CET. 26600 views, 6280 downloads, 2 subscriptions

Rating: 5/5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.


2015-07-09 Version 4.6

Development and release branches are available.

New features

  • Use Pareto smoothed importance sampling (Vehtari & Gelman, 2015; a generic sketch of importance-sampling LOO follows this list) for:

    • importance sampling leave-one-out cross-validation (gpmc_loopred.m)

    • importance sampling integration over hyperparameters (gp_ia.m)

    • the importance sampling part of the logistic Gaussian process density estimation (lgpdens.m)

  • References:

    • Aki Vehtari and Andrew Gelman (2015). Pareto smoothed importance sampling. arXiv preprint arXiv:1507.02646.
    • Aki Vehtari, Andrew Gelman and Jonah Gabry (2015). Efficient implementation of leave-one-out cross-validation and WAIC for evaluating fitted Bayesian models.
  • New covariance functions

    • gpcf_additive creates a mixture over products of kernels for each dimension. Reference: Duvenaud, D. K., Nickisch, H., & Rasmussen, C. E. (2011). Additive Gaussian processes. In Advances in Neural Information Processing Systems, pp. 226-234.
    • gpcf_linearLogistic corresponds to the logistic mean function
    • gpcf_linearMichelismenten corresponds to the Michaelis-Menten mean function
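
GPstuff implements the full Pareto smoothing step in MATLAB/Octave; the following is only a rough NumPy sketch of the underlying idea, plain importance-sampling LOO with per-observation weights proportional to 1/p(y_i | theta_s). PSIS additionally stabilizes the largest weights with a generalized Pareto fit, which is omitted here.

    import numpy as np

    def is_loo_lppd(log_lik):
        """Plain importance-sampling LOO log predictive densities (generic sketch).

        log_lik[s, i] = log p(y_i | theta_s) for S posterior draws and n observations.
        """
        log_w = -log_lik                                  # unnormalized log IS weights
        log_w -= log_w.max(axis=0, keepdims=True)         # stabilize the exponentials
        w = np.exp(log_w)
        w /= w.sum(axis=0, keepdims=True)                 # normalize per observation
        # LOO predictive density of y_i: sum_s w_s^(i) * p(y_i | theta_s)
        return np.log(np.sum(w * np.exp(log_lik), axis=0))

    # Toy usage: posterior draws for a unit-variance Gaussian mean with a flat prior.
    rng = np.random.default_rng(3)
    y = rng.normal(0.0, 1.0, 25)
    theta = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), 400)
    log_lik = -0.5 * (y[None, :] - theta[:, None]) ** 2 - 0.5 * np.log(2 * np.pi)
    print("IS-LOO elpd estimate:", is_loo_lppd(log_lik).sum().round(2))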

Improvements: faster EP moment calculation for lik_logit

Several minor bugfixes

JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.6

by hn - July 6, 2015, 12:31:28 CET. 29019 views, 6776 downloads, 4 subscriptions

Rating: 5/5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.
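
As a minimal NumPy reminder of the exact Gaussian-likelihood case that GPML generalizes (posterior mean K_*(K + sn^2 I)^-1 y and the corresponding latent variance), here is a sketch with an assumed squared-exponential kernel; it is not GPML's Octave/Matlab interface.

    import numpy as np

    def se_kernel(A, B, ell=1.0, sf=1.0):
        """Squared-exponential covariance between the rows of A and the rows of B."""
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return sf**2 * np.exp(-0.5 * d2 / ell**2)

    def gp_predict(X, y, Xs, sn=0.1, ell=1.0, sf=1.0):
        """Exact GP regression with Gaussian noise (textbook Cholesky formulation)."""
        K = se_kernel(X, X, ell, sf) + sn**2 * np.eye(len(X))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        Ks = se_kernel(Xs, X, ell, sf)
        mean = Ks @ alpha
        v = np.linalg.solve(L, Ks.T)
        var = se_kernel(Xs, Xs, ell, sf).diagonal() - np.sum(v**2, axis=0)
        return mean, var                                  # latent mean and variance

    rng = np.random.default_rng(4)
    X = rng.uniform(-3, 3, (40, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
    Xs = np.linspace(-3, 3, 5)[:, None]
    mean, var = gp_predict(X, y, Xs)
    print(np.round(mean, 2), np.round(var, 3))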

  • added a new inference function infGrid_Laplace allowing the use of non-Gaussian likelihoods for large grids

  • fixed a bug due to Octave evaluating norm([]) to a tiny nonzero value by modifying all lik/lik*.m functions; reported by Philipp Richter

  • small bugfixes in covGrid and infGrid

  • bugfix in predictive variance of likNegBinom due to Seth Flaxman

  • bugfix in infFITC_Laplace as suggested by Wu Lin

  • bugfix in covPP{iso,ard}

Hivemall 0.3

by myui - March 13, 2015, 17:08:22 CET. 7408 views, 1238 downloads, 3 subscriptions

About: Hivemall is a scalable machine learning library running on Hive/Hadoop.

  • Added support for Matrix Factorization
  • Added support for TF-IDF computation (a generic TF-IDF sketch follows this list)
  • Added support for AdaGrad/AdaDelta
  • Added support for AdaGradRDA classification
  • Added a normalization scheme
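
Hivemall exposes these operations as Hive UDFs; purely as a reference for what TF-IDF computes, here is a compact Python sketch that is unrelated to Hivemall's SQL interface.

    import math
    from collections import Counter

    def tf_idf(docs):
        """TF-IDF weights for a list of tokenized documents (generic sketch).

        tf = term count / document length, idf = log(N / document frequency).
        """
        n = len(docs)
        df = Counter(term for doc in docs for term in set(doc))
        weights = []
        for doc in docs:
            tf, length = Counter(doc), len(doc)
            weights.append({t: (c / length) * math.log(n / df[t]) for t, c in tf.items()})
        return weights

    docs = [["spark", "hive", "hadoop"], ["hive", "sql", "hive"], ["spark", "ml"]]
    for w in tf_idf(docs):
        print({t: round(v, 3) for t, v in w.items()})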

pyGPs 1.3.2

by mn - January 17, 2015, 13:08:43 CET. 5894 views, 1454 downloads, 4 subscriptions

About: pyGPs is a Python package for Gaussian process (GP) regression and classification for machine learning.


Changelog pyGPs v1.3.2

December 15th 2014

  • pyGPs added to pip
  • mathematical definitions of kernel functions available in documentation
  • more error messages added

The Statistical ToolKit 0.8.4

by joblion - December 5, 2014, 13:21:47 CET. 1834 views, 585 downloads, 2 subscriptions

About: STK++: A Statistical Toolkit Framework in C++


Integrated OpenMP into the current release. Many enhancements in the clustering project. Bug fixes.

linearizedGP 1.0

by dsteinberg - November 28, 2014, 07:02:54 CET. 1360 views, 329 downloads, 1 subscription

About: Gaussian processes with general nonlinear likelihoods using the unscented transform or Taylor series linearisation.
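
The package wraps these approximations inside full GP inference; the snippet below is only a generic NumPy sketch of the unscented transform itself (propagating a Gaussian through a nonlinearity via sigma points), not the linearizedGP API.

    import numpy as np

    def unscented_transform(m, P, f, alpha=1.0, beta=2.0, kappa=0.0):
        """Propagate N(m, P) through a nonlinearity f using 2n+1 sigma points (sketch)."""
        n = len(m)
        lam = alpha**2 * (n + kappa) - n
        S = np.linalg.cholesky((n + lam) * P)             # scaled matrix square root
        sigma = np.vstack([m, m + S.T, m - S.T])          # sigma points as rows
        wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
        wc = wm.copy()
        wm[0] = lam / (n + lam)
        wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
        Y = np.array([f(s) for s in sigma])               # push each point through f
        mean = wm @ Y
        diff = Y - mean
        cov = (wc[:, None] * diff).T @ diff
        return mean, cov

    # Toy usage: a 2-D Gaussian pushed through a scalar nonlinearity.
    m = np.array([0.5, -0.2])
    P = np.array([[0.2, 0.05], [0.05, 0.1]])
    f = lambda x: np.array([np.tanh(x[0] + x[1])])
    print(unscented_transform(m, P, f))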


Initial Announcement on mloss.org.

Boosted Decision Trees and Lists 1.0.4

by melamed - July 25, 2014, 23:08:32 CET. 4944 views, 1506 downloads, 3 subscriptions

About: Boosting algorithms for classification and regression, with many variations. Features include: Scalable and robust; Easily customizable loss functions; One-shot training for an entire regularization path; Continuous checkpointing; much more

  • added ElasticNets as a regularization option (a generic elastic-net sketch follows this list)
  • fixed some segfaults, memory leaks, and out-of-range errors that appeared in some corner cases
  • added a couple of I/O optimizations
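
The ElasticNets option refers to the elastic-net penalty, a weighted combination of L1 and L2 regularization. As a generic reference (not this package's C++ code), here is a small coordinate-descent sketch for least squares with an elastic-net penalty.

    import numpy as np

    def soft_threshold(rho, t):
        return np.sign(rho) * max(abs(rho) - t, 0.0)

    def elastic_net_cd(X, y, lam=1.0, alpha=0.5, iters=100):
        """Coordinate descent for 0.5*||y - Xw||^2 + lam*(alpha*||w||_1 + 0.5*(1-alpha)*||w||_2^2)."""
        n, d = X.shape
        w = np.zeros(d)
        r = y - X @ w                                  # current residual
        col_sq = np.sum(X**2, axis=0)
        for _ in range(iters):
            for j in range(d):
                rho = X[:, j] @ r + col_sq[j] * w[j]   # correlation with partial residual
                w_new = soft_threshold(rho, lam * alpha) / (col_sq[j] + lam * (1 - alpha))
                r += X[:, j] * (w[j] - w_new)          # keep the residual in sync
                w[j] = w_new
        return w

    # Usage: recover a sparse weight vector from noisy linear measurements.
    rng = np.random.default_rng(5)
    X = rng.standard_normal((100, 8))
    w_true = np.array([2.0, -1.5, 0, 0, 0, 0, 1.0, 0])
    y = X @ w_true + 0.1 * rng.standard_normal(100)
    print(np.round(elastic_net_cd(X, y, lam=5.0, alpha=0.9), 2))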

Kernel Adaptive Filtering Toolbox 1.4

by steven2358 - May 26, 2014, 18:24:23 CET. 5983 views, 1067 downloads, 1 subscription

About: A Matlab benchmarking toolbox for online and adaptive regression with kernels.

  • Improvements and demo script for profiler
  • Initial version of documentation
  • Several new algorithms
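
The toolbox benchmarks many online kernel regression algorithms in MATLAB; as a generic Python illustration of the family (not the toolbox's API), here is kernel least-mean-squares (KLMS), where each incoming sample joins the dictionary with a coefficient proportional to its prediction error.

    import numpy as np

    class KLMS:
        """Kernel least-mean-squares, a simple online kernel regression filter (sketch)."""
        def __init__(self, eta=0.5, gamma=1.0):
            self.eta, self.gamma = eta, gamma
            self.centers, self.coefs = [], []

        def predict(self, x):
            return sum(c * np.exp(-self.gamma * np.sum((x - z) ** 2))
                       for c, z in zip(self.coefs, self.centers))

        def update(self, x, y):
            err = y - self.predict(x)          # prediction error on the new sample
            self.centers.append(np.asarray(x, dtype=float))
            self.coefs.append(self.eta * err)  # the sample joins the dictionary
            return err

    # Online regression on a noisy sine: the errors should shrink over time.
    rng = np.random.default_rng(6)
    f = KLMS(eta=0.5, gamma=2.0)
    errs = [abs(f.update(x, np.sin(3 * x) + 0.05 * rng.standard_normal()))
            for x in rng.uniform(0, 2, 300)]
    print("mean |error|, first vs last 50 samples:",
          round(np.mean(errs[:50]), 3), round(np.mean(errs[-50:]), 3))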

MLDemos 0.5.1

by basilio - March 2, 2013, 16:06:13 CET. 23753 views, 5444 downloads, 2 subscriptions

About: MLDemos is a user-friendly visualization interface for various machine learning algorithms for classification, regression, clustering, projection, dynamical systems, reward maximisation and reinforcement learning.


New Visualization and Dataset Features

  • Added 3D visualization of samples and classification, regression and maximization results
  • Added Visualization panel with individual plots, correlations, density, etc.
  • Added Editing tools to drag/magnet data, change class, increase or decrease dimensions of the dataset
  • Added categorical dimensions (indexed dimensions with non-numerical values)
  • Added Dataset Editing panel to swap, delete and rename dimensions, classes or categorical values
  • Several bug fixes for display, import/export of data, classification performance

New Algorithms and Methodologies

  • Added Projections to pre-process data (which can then be classified/regressed/clustered), with LDA, PCA, KernelPCA, ICA, CCA
  • Added Grid-Search panel for batch-testing ranges of values for up to two parameters at a time
  • Added One-vs-All multi-class classification for non-multi-class algorithms
  • Trained models can now be kept and tested on new data (training on one dataset, testing on another)
  • Added a dataset generator panel for standard toy datasets (e.g. swissroll, checkerboard, ...)
  • Added a number of clustering, regression and classification algorithms (FLAME, DBSCAN, LOWESS, CCA, KMEANS++, GP Classification, Random Forests)
  • Added Save/Load Model option for GMMs and SVMs
  • Added Growing Hierarchical Self Organizing Maps (original code by Michael Dittenbach)
  • Added Automatic Relevance Determination for SVM with RBF kernel (thanks to Ashwini Shukla!)

Orange 2.6

by janez - February 14, 2013, 18:15:08 CET. 14963 views, 2845 downloads, 1 subscription

Rating: 4/5 stars (based on 1 vote)

About: Orange is a component-based machine learning and data mining software. It includes a friendly yet powerful and flexible graphical user interface for visual programming. For more advanced use(r)s, [...]


The core of the system (except the GUI) no longer includes any GPL code and can be licensed under the terms of BSD upon request. The graphical part remains under GPL.

Changed the BibTeX reference to the paper recently published in JMLR MLOSS.

GPLP

About: This local and parallel computation toolbox is the Octave and Matlab implementation of several localized Gaussian process regression methods: the domain decomposition method (Park et al., 2011, DDM), partial independent conditional (Snelson and Ghahramani, 2007, PIC), localized probabilistic regression (Urtasun and Darrell, 2008, LPR), and bagging for Gaussian process regression (Chen and Ren, 2009, BGP). Most of the localized regression methods can be applied to general machine learning problems, although DDM is only applicable to spatial datasets. In addition, GPLP provides two parallel computation versions of the domain decomposition method. The ease of parallelization is one of the advantages of localized regression, and the two parallel implementations provide good guidance on how to materialize this advantage in software.


Initial Announcement on mloss.org.

MLPY Machine Learning Py 3.5.0

by albanese - March 15, 2012, 09:52:41 CET. 62043 views, 11584 downloads, 2 subscriptions

Rating: 3.5/5 stars (based on 3 votes)

About: mlpy is a Python module for Machine Learning built on top of NumPy/SciPy and GSL.


New features:

  • LibSvm(): pred_probability() now returns probability estimates; pred_values() added
  • LibLinear(): pred_values() and pred_probability() added
  • dtw_std: squared Euclidean option added (a plain DTW sketch follows this list)
  • LCS for series composed of real values (lcs_real()) added
  • Documentation
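
mlpy's dtw_std is a compiled implementation; as a plain NumPy reference for the dynamic programming it performs, here is DTW with the squared Euclidean local cost that the new option enables (a generic sketch, not mlpy's API).

    import numpy as np

    def dtw_squared_euclidean(x, y):
        """Dynamic time warping distance between two 1-D series (generic sketch).

        Local cost d(i, j) = (x_i - y_j)^2 with the standard recursion
        D[i, j] = d(i, j) + min(D[i-1, j], D[i, j-1], D[i-1, j-1]).
        """
        n, m = len(x), len(y)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = (x[i - 1] - y[j - 1]) ** 2
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    a = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0])
    b = np.array([0.0, 1.0, 2.0, 1.0, 0.0, 0.0])
    print("DTW distance:", dtw_squared_euclidean(a, b))   # 0.0: b is a time-shifted a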


Fixes:

  • wavelet submodule: cwt(): it returned only real values in morlet and poul
  • IRelief(): remove np. in learn()
  • fix rfe_kfda and rfe_w2 when p=1

JMLR LWPR 1.2.4

by sklanke - February 6, 2012, 19:55:41 CET. 31620 views, 3955 downloads, 1 subscription

About: Locally Weighted Projection Regression (LWPR) is a recent algorithm that achieves nonlinear function approximation in high dimensional spaces with redundant and irrelevant input dimensions. At its [...]


Version 1.2.4

  • Corrected typo in lwpr.c (wrong function name for multi-threaded helper function on Unix systems). Thanks to Jose Luis Rivero.

Kernel Machine Library 0.2

by pawelm - December 27, 2011, 17:14:01 CET. 5003 views, 202 downloads, 1 subscription

About: The Kernel-Machine Library is a free (released under the LGPL) C++ library to promote the use of and progress of kernel machines.


Updated mloss entry (minor fixes).
