Projects tagged with regression.
Showing items 1-20 of 43.

KeLP 2.0.1

by kelpadmin - January 13, 2016, 12:47:31 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5713 views, 1418 downloads, 3 subscriptions

About: Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as an agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions and learning algorithms through the definition of specific interfaces: once a new kernel function has been implemented, it can be automatically adopted in all the available kernel-machine algorithms. KeLP includes different online and batch learning algorithms for classification, regression and clustering, as well as several kernel functions, ranging from vector-based to structural kernels. It allows building complex kernel-machine-based systems, leveraging JSON/XML interfaces to instantiate prediction models without writing a single line of code.
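KeLP itself is a Java framework and its actual classes are documented in the project's Javadoc; the short Python sketch below is only a language-agnostic illustration of the decoupling idea described above, using hypothetical names (Kernel, LinearKernel, KernelPerceptron) that are not part of KeLP's API.

```python
# Hypothetical sketch of decoupling kernels from learning algorithms via an
# interface; the names here are illustrative and are NOT KeLP's Java API.
from abc import ABC, abstractmethod

class Kernel(ABC):
    """Interface that every kernel function implements."""
    @abstractmethod
    def compute(self, a, b) -> float:
        ...

class LinearKernel(Kernel):
    def compute(self, a, b) -> float:
        return sum(ai * bi for ai, bi in zip(a, b))

class KernelPerceptron:
    """A kernel machine that works with any Kernel implementation."""
    def __init__(self, kernel: Kernel):
        self.kernel = kernel
        self.support = []  # (coefficient, example) pairs

    def predict(self, x) -> float:
        return sum(c * self.kernel.compute(sv, x) for c, sv in self.support)

    def learn(self, x, y) -> None:
        if y * self.predict(x) <= 0:  # mistake-driven update
            self.support.append((y, x))

# Any newly implemented kernel is immediately usable by every kernel machine.
clf = KernelPerceptron(LinearKernel())
clf.learn([1.0, 2.0], +1)
clf.learn([-1.0, -2.0], -1)
print(clf.predict([0.5, 1.0]))
```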

Changes:

In addition to minor bug fixes, this release includes:

  • Soft Confidence-Weighted Classification algorithm: a brand new online learning algorithm from Wang, J., Zhao, P., Hoi, S.C.: Exact soft confidence-weighted learning. In Proceedings of ICML 2012. ACM, New York, NY, USA (2012)

  • Optimization of the kernel caching mechanism

  • The Smooth Partial Tree Kernel and the Partial Tree Kernel now allow specifying a maximum branching factor (parameter: maxSubseqLeng) for the tree fragments considered by the kernel operation.

Check out this new version from our repositories. The API Javadoc is already available. Your suggestions will be very valuable to us, so download and try KeLP 2.0.1!


MLweb 0.1.3

by lauerfab - December 17, 2015, 10:29:35 CET [ Project Homepage BibTeX Download ] 2687 views, 662 downloads, 3 subscriptions

About: MLweb is an open source project that aims at bringing machine learning capabilities into web pages and web applications, while maintaining all computations on the client side. It includes (i) a JavaScript library to enable scientific computing within web pages, (ii) a JavaScript library implementing machine learning algorithms for classification, regression, clustering and dimensionality reduction, and (iii) a web application providing a Matlab-like development environment.

Changes:
  • Improve NaiveBayes classifier
  • Add online training functions for KNN and NaiveBayes
  • Fix save/load workspace in LALOLab
  • Fix nullspace()
  • Small bug fixes

JMLR dlib ml 18.18

by davis685 - October 29, 2015, 01:48:44 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 128606 views, 21259 downloads, 4 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems.

Changes:

This release has focused on build system improvements, both for the Python API and C++ builds using CMake. This includes adding a setup.py script for installing the dlib Python API as well as a make install target for installing a C++ shared library for non-Python use.


SALSA.jl 0.0.5

by jumutc - September 28, 2015, 17:28:56 CET [ Project Homepage BibTeX Download ] 838 views, 152 downloads, 1 subscription

About: SALSA (Software lab for Advanced machine Learning with Stochastic Algorithms) is an implementation of well-known stochastic algorithms for machine learning, developed in the high-level technical computing language Julia. The SALSA software package is designed to address challenges in sparse linear modelling and in linear and non-linear Support Vector Machines applied to large data samples, with an emphasis on being user-centric and user-friendly.

Changes:

Initial Announcement on mloss.org.


WEKA 3.7.13

by mhall - September 11, 2015, 04:55:02 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 53866 views, 8001 downloads, 4 subscriptions

Rating: 4/5 stars (based on 6 votes)

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...]

Changes:

In core weka:

  • Numerically stable implementation of variance calculation in core Weka classes - thanks to Benjamin Weber
  • Unified expression parsing framework (with compiled expressions) is now employed by filters and tools that use mathematical/logical expressions - thanks to Benjamin Weber
  • Developers can now specify GUI and command-line options for their Weka schemes via a new unified annotation-based mechanism
  • ClassConditionalProbabilities filter - replaces the value of a nominal attribute in a given instance with its probability given each of the possible class values
  • GUI package manager's available list now shows both packages that are not currently installed, and those installed packages for which there is a more recent version available that is compatible with the base version of Weka being used
  • ReplaceWithMissingValue filter - allows values to be randomly (with a user-specified probability) replaced with missing values. Useful for experimenting with methods for imputing missing values
  • WrapperSubsetEval can now use plugin evaluation metrics

In packages:

  • alternatingModelTrees package - alternating trees for regression
  • timeSeriesFilters package, contributed by Benjamin Weber
  • distributedWekaSpark package - wrapper for distributed Weka on Spark
  • wekaPython package - execution of CPython scripts and wrapper classifier/clusterer for Scikit Learn schemes (see the sketch after this list)
  • MLRClassifier in RPlugin now provides access to almost all classification and regression learners in MLR 2.4
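For readers who have not used it, the wekaPython item above refers to running CPython code and wrapping scikit-learn estimators from inside Weka. The snippet below is ordinary scikit-learn code, shown only to illustrate the kind of "Scikit Learn scheme" being wrapped; it is not the wekaPython wrapper API, and the dataset and parameters are arbitrary placeholders.

```python
# Plain scikit-learn usage of the kind wrapped by the wekaPython package.
# This is NOT the wekaPython API itself; it only shows a scikit-learn "scheme".
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier(n_estimators=100, random_state=0)

# 10-fold cross-validated accuracy, analogous to Weka's default evaluation setup.
scores = cross_val_score(clf, X, y, cv=10)
print("mean accuracy: %.3f" % scores.mean())
```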

YCML 0.2.2

by yconst - August 24, 2015, 20:28:45 CET [ Project Homepage BibTeX Download ] 1021 views, 211 downloads, 3 subscriptions

About: A Machine Learning framework for Objective-C and Swift (OS X / iOS)

Changes:

Initial Announcement on mloss.org.


JMLR GPstuff 4.6

by avehtari - July 15, 2015, 15:08:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 28953 views, 6807 downloads, 2 subscriptions

Rating: 5/5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2015-07-09 Version 4.6

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Use Pareto smoothed importance sampling (Vehtari & Gelman, 2015; see the note after this changelog) for:

    • importance sampling leave-one-out cross-validation (gpmc_loopred.m)
    • importance sampling integration over hyperparameters (gp_ia.m)
    • the importance sampling part of logistic Gaussian process density estimation (lgpdens.m)

    References:

    • Aki Vehtari and Andrew Gelman (2015). Pareto smoothed importance sampling. arXiv preprint arXiv:1507.02646.
    • Aki Vehtari, Andrew Gelman and Jonah Gabry (2015). Efficient implementation of leave-one-out cross-validation and WAIC for evaluating fitted Bayesian models.

  • New covariance functions:

    • gpcf_additive creates a mixture over products of kernels for each dimension. Reference: Duvenaud, D. K., Nickisch, H., & Rasmussen, C. E. (2011). Additive Gaussian processes. In Advances in Neural Information Processing Systems, pp. 226-234.
    • gpcf_linearLogistic corresponds to a logistic mean function
    • gpcf_linearMichelismenten corresponds to a Michaelis-Menten mean function

Improvements: faster EP moment calculation for lik_logit

Several minor bugfixes
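As a brief note on the Pareto smoothed importance sampling items above (following the Vehtari & Gelman reference listed there): with posterior draws \theta^{(s)}, s = 1, ..., S, the leave-one-out predictive density for observation i is approximated by a weighted average whose raw importance ratios are inversely proportional to that observation's likelihood,

$$
w_i^{(s)} \propto \frac{1}{p(y_i \mid \theta^{(s)})},
\qquad
p(\tilde y_i \mid y_{-i}) \approx
\frac{\sum_{s=1}^{S} \tilde w_i^{(s)}\, p(\tilde y_i \mid \theta^{(s)})}
     {\sum_{s=1}^{S} \tilde w_i^{(s)}},
$$

where the smoothed weights \tilde w_i^{(s)} are obtained by fitting a generalized Pareto distribution to the largest raw ratios. This is a sketch of the general method, not of the exact GPstuff implementation in gpmc_loopred.m, gp_ia.m and lgpdens.m.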


JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.6

by hn - July 6, 2015, 12:31:28 CET [ Project Homepage BibTeX Download ] 30804 views, 7135 downloads, 4 subscriptions

Rating: 5/5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.

Changes:
  • added a new inference function infGrid_Laplace allowing the use of non-Gaussian likelihoods for large grids

  • fixed a bug, reported by Philipp Richter, caused by Octave evaluating norm([]) to a tiny nonzero value; modified all lik/lik*.m functions accordingly

  • small bugfixes in covGrid and infGrid

  • bugfix in the predictive variance of likNegBinom, thanks to Seth Flaxman

  • bugfix in infFITC_Laplace as suggested by Wu Lin

  • bugfix in covPP{iso,ard}


Hivemall 0.3

by myui - March 13, 2015, 17:08:22 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 8202 views, 1367 downloads, 3 subscriptions

About: Hivemall is a scalable machine learning library running on Hive/Hadoop.

Changes:
  • Supported Matrix Factorization
  • Added support for TF-IDF computation
  • Supported AdaGrad/AdaDelta
  • Supported AdaGradRDA classification
  • Added normalization scheme

pyGPs 1.3.2

by mn - January 17, 2015, 13:08:43 CET [ Project Homepage BibTeX Download ] 6473 views, 1559 downloads, 4 subscriptions

About: pyGPs is a Python package for Gaussian process (GP) regression and classification for machine learning.

Changes:

Changelog pyGPs v1.3.2

December 15th 2014

  • pyGPs added to pip
  • mathematical definitions of kernel functions now available in the documentation
  • more error messages added
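Since pyGPs is a Python package and, per the changelog above, is now installable via pip, a minimal regression sketch along the lines of the package's documented GPR workflow looks roughly as follows; the toy data and the default mean/covariance settings are assumptions, not a prescription.

```python
# Minimal pyGPs regression sketch (assumes `pip install pyGPs`; toy data and
# default zero-mean / RBF-covariance settings are placeholders).
import numpy as np
import pyGPs

x = np.linspace(-3.0, 3.0, 30).reshape(-1, 1)      # training inputs
y = np.sin(x) + 0.1 * np.random.randn(30, 1)       # noisy training targets
z = np.linspace(-3.0, 3.0, 100).reshape(-1, 1)     # test inputs

model = pyGPs.GPR()        # Gaussian process regression model
model.getPosterior(x, y)   # fit with the default mean and kernel
model.optimize(x, y)       # optimize hyperparameters
model.predict(z)           # predictive results are stored on the model

print(model.ym[:5])        # first few predictive means
```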

The Statistical ToolKit 0.8.4

by joblion - December 5, 2014, 13:21:47 CET [ Project Homepage BibTeX Download ] 2124 views, 641 downloads, 2 subscriptions

About: STK++: A Statistical Toolkit Framework in C++

Changes:

Integrated OpenMP into the current release. Many enhancements in the clustering project. Bug fixes.


linearizedGP 1.0

by dsteinberg - November 28, 2014, 07:02:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1632 views, 378 downloads, 1 subscription

About: Gaussian processes with general nonlinear likelihoods using the unscented transform or Taylor series linearisation.
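As general background on the linearisation idea named above (a hedged sketch, not necessarily this package's exact formulation): for a latent GP value f observed through a nonlinear forward model g with Gaussian noise, a first-order Taylor expansion around a point \bar f makes the likelihood locally linear-Gaussian in f, so standard conjugate GP updates apply; the unscented transform approximates the same moments by propagating a small set of sigma points through g instead of differentiating it.

$$
y = g(f) + \epsilon, \quad \epsilon \sim \mathcal{N}(0, \sigma^2),
\qquad
g(f) \approx g(\bar f) + g'(\bar f)\,(f - \bar f).
$$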

Changes:

Initial Announcement on mloss.org.


Boosted Decision Trees and Lists 1.0.4

by melamed - July 25, 2014, 23:08:32 CET [ BibTeX Download ] 5354 views, 1617 downloads, 3 subscriptions

About: Boosting algorithms for classification and regression, with many variations. Features include: scalable and robust implementation; easily customizable loss functions; one-shot training for an entire regularization path; continuous checkpointing; and much more.

Changes:
  • added ElasticNets as a regularization option
  • fixed some segfaults, memory leaks, and out-of-range errors that had crept into some corner cases
  • added a couple of I/O optimizations

Kernel Adaptive Filtering Toolbox 1.4

by steven2358 - May 26, 2014, 18:24:23 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6518 views, 1155 downloads, 1 subscription

About: A Matlab benchmarking toolbox for online and adaptive regression with kernels.

Changes:
  • Improvements and demo script for profiler
  • Initial version of documentation
  • Several new algorithms

MLDemos 0.5.1

by basilio - March 2, 2013, 16:06:13 CET [ Project Homepage BibTeX Download ] 25075 views, 5703 downloads, 2 subscriptions

About: MLDemos is a user-friendly visualization interface for various machine learning algorithms for classification, regression, clustering, projection, dynamical systems, reward maximisation and reinforcement learning.

Changes:

New Visualization and Dataset Features

  • Added 3D visualization of samples and classification, regression and maximization results
  • Added Visualization panel with individual plots, correlations, density, etc.
  • Added Editing tools to drag/magnet data, change class, increase or decrease dimensions of the dataset
  • Added categorical dimensions (indexed dimensions with non-numerical values)
  • Added Dataset Editing panel to swap, delete and rename dimensions, classes or categorical values
  • Several bug-fixes for display, import/export of data, classification performance

New Algorithms and Methodologies

  • Added Projections to pre-process data (which can then be classified/regressed/clustered), with LDA, PCA, KernelPCA, ICA, CCA
  • Added Grid-Search panel for batch-testing ranges of values for up to two parameters at a time
  • Added One-vs-All multi-class classification for non-multi-class algorithms
  • Trained models can now be kept and tested on new data (training on one dataset, testing on another)
  • Added a dataset generator panel for standard toy datasets (e.g. swissroll, checkerboard, ...)
  • Added a number of clustering, regression and classification algorithms (FLAME, DBSCAN, LOWESS, CCA, KMEANS++, GP Classification, Random Forests)
  • Added Save/Load Model option for GMMs and SVMs
  • Added Growing Hierarchical Self Organizing Maps (original code by Michael Dittenbach)
  • Added Automatic Relevance Determination for SVM with RBF kernel (thanks to Ashwini Shukla!)


Orange 2.6

by janez - February 14, 2013, 18:15:08 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 15631 views, 3019 downloads, 1 subscription

Rating: 4/5 stars (based on 1 vote)

About: Orange is component-based machine learning and data mining software. It includes a friendly yet powerful and flexible graphical user interface for visual programming. For more advanced use(r)s, [...]

Changes:

The core of the system (except the GUI) no longer includes any GPL code and can be licensed under the terms of BSD upon request. The graphical part remains under GPL.

Changed the BibTeX reference to the paper recently published in JMLR MLOSS.


About: This local and parallel computation toolbox is the Octave and Matlab implementation of several localized Gaussian process regression methods: the domain decomposition method (Park et al., 2011, DDM), partially independent conditional (Snelson and Ghahramani, 2007, PIC), localized probabilistic regression (Urtasun and Darrell, 2008, LPR), and bagging for Gaussian process regression (Chen and Ren, 2009, BGP). Most of the localized regression methods can be applied to general machine learning problems, although DDM is only applicable to spatial datasets. In addition, GPLP provides two parallel computation versions of the domain decomposition method. Ease of parallelization is one of the advantages of localized regression, and the two parallel implementations provide good guidance on how to realize this advantage in software.

Changes:

Initial Announcement on mloss.org.


MLPY Machine Learning Py 3.5.0

by albanese - March 15, 2012, 09:52:41 CET [ Project Homepage BibTeX Download ] 64852 views, 12079 downloads, 2 subscriptions

Rating: 3.5/5 stars (based on 3 votes)

About: mlpy is a Python module for Machine Learning built on top of NumPy/SciPy and GSL.

Changes:

New features:

  • LibSvm(): pred_probability() now returns probability estimates; pred_values() added (see the sketch after this changelog)
  • LibLinear(): pred_values() and pred_probability() added
  • dtw_std: squared Euclidean option added
  • LCS for series composed of real values (lcs_real()) added
  • Documentation

Fix:

  • wavelet submodule: cwt() returned only real values for morlet and poul
  • IRelief(): removed a stray np. reference in learn()
  • fixed rfe_kfda and rfe_w2 when p=1
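A minimal sketch of how the updated LibSvm prediction calls above might be used (toy data; the constructor arguments shown are assumed reasonable values, not a prescription):

```python
# Toy mlpy LibSvm example exercising pred(), pred_values() and pred_probability();
# the data and parameters below are arbitrary placeholders.
import numpy as np
import mlpy

np.random.seed(0)
x = np.random.randn(100, 2)                    # 100 samples, 2 features
y = np.where(x[:, 0] + x[:, 1] > 0, 1, -1)     # simple two-class labels

svm = mlpy.LibSvm(svm_type='c_svc', kernel_type='rbf',
                  gamma=0.5, probability=True)
svm.learn(x, y)

xt = np.random.randn(5, 2)                     # a few test points
print(svm.pred(xt))                 # predicted labels
print(svm.pred_values(xt))          # decision values (added in 3.5.0)
print(svm.pred_probability(xt))     # probability estimates (updated in 3.5.0)
```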

JMLR LWPR 1.2.4

by sklanke - February 6, 2012, 19:55:41 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 32478 views, 4055 downloads, 1 subscription

About: Locally Weighted Projection Regression (LWPR) is a recent algorithm that achieves nonlinear function approximation in high dimensional spaces with redundant and irrelevant input dimensions. At its [...]

Changes:

Version 1.2.4

  • Corrected a typo in lwpr.c (wrong function name for the multi-threaded helper function on Unix systems). Thanks to Jose Luis Rivero.

Kernel Machine Library 0.2

by pawelm - December 27, 2011, 17:14:01 CET [ Project Homepage BibTeX BibTeX for corresponding Paper ] 5362 views, 222 downloads, 1 subscription

About: The Kernel-Machine Library is a free (released under the LGPL) C++ library to promote the use and progress of kernel machines.

Changes:

Updated mloss entry (minor fixes).

