Projects that also appeared in JMLR.
Showing Items 1-20 of 43 on page 1 of 3.

JMLR MLPACK 1.0.11

by rcurtin - December 11, 2014, 18:20:35 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 35675 views, 6999 downloads, 6 subscriptions

Rating: 4.5/5 (based on 1 vote)

About: A scalable, fast C++ machine learning library, with emphasis on usability.

Changes:
  • Proper handling of dimension calculation in PCA.
  • Load parameter vectors properly for LinearRegression models.
  • Linker fixes for AugLagrangian specializations under Visual Studio.
  • Add support for observation weights to LinearRegression.
  • MahalanobisDistance<> now takes root of the distance by default and therefore satisfies the triangle inequality (TakeRoot now defaults to true).
  • Better handling of optional Armadillo HDF5 dependency.
  • Fixes for numerous intermittent test failures.
  • math::RandomSeed() now sets the seed for recent (>= 3.930) Armadillo versions.
  • Handle Newton method convergence better for SparseCoding::OptimizeDictionary() and make maximum iterations a parameter.
  • Known bug: CosineTree construction may fail in some cases on i386 systems (issue 376).
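
The TakeRoot change above has a concrete consequence: the squared Mahalanobis distance is not a metric, while the rooted version satisfies the triangle inequality. A minimal pure-Python sketch of that behaviour (not mlpack's C++ API; the identity inverse covariance is chosen purely for illustration):

```python
import math

def mahalanobis(x, y, cov_inv, take_root=True):
    """Mahalanobis distance between x and y given an inverse covariance matrix.

    With take_root=True (mlpack's new default) the result satisfies the
    triangle inequality; the squared form does not.
    """
    d = [xi - yi for xi, yi in zip(x, y)]
    # quadratic form d^T * cov_inv * d
    sq = sum(d[i] * cov_inv[i][j] * d[j]
             for i in range(len(d)) for j in range(len(d)))
    return math.sqrt(sq) if take_root else sq

cov_inv = [[1.0, 0.0], [0.0, 1.0]]  # identity: reduces to Euclidean distance
a, b, c = (0.0, 0.0), (1.0, 0.0), (2.0, 0.0)

# Rooted: d(a,c) <= d(a,b) + d(b,c) holds (2 <= 1 + 1).
rooted = mahalanobis(a, c, cov_inv) <= \
    mahalanobis(a, b, cov_inv) + mahalanobis(b, c, cov_inv)
# Squared: 4 > 1 + 1, so the triangle inequality fails.
squared = mahalanobis(a, c, cov_inv, take_root=False) <= \
    mahalanobis(a, b, cov_inv, take_root=False) + \
    mahalanobis(b, c, cov_inv, take_root=False)
```

With collinear points the rooted comparison is tight (2 = 1 + 1), which is exactly where the squared form breaks down.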

JMLR JKernelMachines 2.5

by dpicard - December 11, 2014, 17:51:42 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 13968 views, 3407 downloads, 3 subscriptions

Rating: 4.5/5 (based on 4 votes)

About: A machine learning library in Java for easy development of new kernels.

Changes:

Version 2.5

  • New active learning algorithms
  • Better threading management
  • New multiclass SVM algorithm based on SDCA
  • Handle class balancing in cross-validation
  • Optional EJML support switched to version 0.26
  • Various bugfixes and improvements

JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.5

by hn - December 8, 2014, 13:54:38 CET [ Project Homepage BibTeX Download ] 20492 views, 4793 downloads, 3 subscriptions

Rating: 5/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.

Changes:
  • mechanism for specifying hyperparameter priors (together with Roman Garnett and José Vallet)
  • new inference method inf/infGrid allowing efficient inference for data defined on a Cartesian grid (together with Andrew Wilson)
  • new mean/cov functions for preference learning: meanPref/covPref
  • new mean/cov functions for non-vectorial data: meanDiscrete/covDiscrete
  • new piecewise constant nearest neighbor mean function: meanNN
  • new mean functions being predictions from GPs: meanGP and meanGPexact
  • new covariance function for standard additive noise: covEye
  • new covariance function for factor analysis: covSEfact
  • new covariance function with varying length scale: covSEvlen
  • generalised covScale to allow scaling by a function instead of a scalar
  • bugfix in covGabor* and covSM (due to Andrew Gordon Wilson)
  • bugfix in lik/likBeta.m (suggested by Dali Wei)
  • bugfix in FITC inference mode (due to Joris Mooij) where the wrong mode for post.L was chosen when using infFITC and post.L being a diagonal matrix
  • bugfix in infVB marginal likelihood for likLogistic with nonzero mean function (reported by James Lloyd)
  • removed the combination likErf/infVB as it yields a bad posterior approximation and lacks theoretical justification
  • Matlab and Octave compilation for L-BFGS-B v2.4 and the more recent L-BFGS-B v3.0 (contributed by José Vallet)
  • smaller bugfixes in gp.m (due to Joris Mooij and Ernst Kloppenburg)
  • updated use of logphi in lik/likErf
  • bugfix in util/solve_chol.c where a typing issue occurred on OS X (due to Todd Small)
  • bugfix due to Bjørn Sand Jensen noticing that cov_deriv_sq_dist.m was missing in the distribution
  • bugfix in infFITC_EP for ttau->inf (suggested by Ryan Turner)

JMLR Sally 0.9.2

by konrad - November 19, 2014, 20:28:35 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 22147 views, 4478 downloads, 3 subscriptions

About: A Tool for Embedding Strings in Vector Spaces

Changes:

Fixed severe bug in concurrent computation of blended n-grams.
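
Sally maps strings to sparse vectors of n-gram counts, and "blended" n-grams combine several n-gram lengths in one vector. A rough stdlib sketch of that idea (this is not Sally's hashing or output format, just the embedding concept):

```python
from collections import Counter

def blended_ngrams(s, nmin=2, nmax=3):
    """Embed a string as a sparse count vector over all n-grams, nmin <= n <= nmax."""
    vec = Counter()
    for n in range(nmin, nmax + 1):
        for i in range(len(s) - n + 1):
            vec[s[i:i + n]] += 1
    return vec

# "banana": 2-grams ba, an, na, an, na and 3-grams ban, ana, nan, ana,
# so "an", "na" and "ana" each occur twice.
v = blended_ngrams("banana")
```

Two strings can then be compared via any vector-space similarity (dot product, cosine) over these sparse counts.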


JMLR dlib ml 18.11

by davis685 - November 13, 2014, 23:42:18 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 87567 views, 15168 downloads, 2 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems.

Changes:

This release contains mostly minor bug fixes and usability improvements, with the notable exception of new routines for extracting local-binary-pattern features from images and improved tools for learning distance metrics.


JMLR Darwin 1.8

by sgould - September 3, 2014, 08:42:53 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 30725 views, 6438 downloads, 4 subscriptions

About: A platform-independent C++ framework for machine learning, graphical models, and computer vision research and development.

Changes:

Version 1.8:

  • Added Superpixel Graph Label Transfer (nnGraph) project
  • Added Python scripts for automating some projects
  • Added ability to pre-process features on-the-fly with one drwnFeatureTransform when applying or learning another drwnFeatureTransform
  • Fixed race condition in Windows threading (thanks to Edison Guo)
  • Switched Windows and Linux to build against OpenCV 2.4.9
  • Changed drwnMAPInference::inference to return upper and lower energy bounds
  • Added pruneRounds function to drwnBoostedClassifier
  • Added drwnSLICSuperpixels function
  • Added drwnIndexQueue class
  • mexLearnClassifier and mexAnalyseClassifier now support integer label types
  • Bug fix in mexSaveSuperpixels to support single channel

JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 16043 views, 3873 downloads, 2 subscriptions

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2014-07-22 Version 4.5

New features

  • Input dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • Option 'autoscale' in gp_rnd.m to obtain split-normal approximate samples from the posterior predictive distribution of the latent variable.

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.

Improvements

  • New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating new features etc.
    • demo_epinf, demonstrating the input dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes


JMLR Waffles 2014-07-05

by mgashler - July 20, 2014, 04:53:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 25269 views, 7387 downloads, 2 subscriptions

About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.)

Changes:

Added support for CUDA GPU-parallelized neural network layers, and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html


JMLR MSVMpack 1.5

by lauerfab - July 3, 2014, 16:02:49 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12484 views, 4147 downloads, 2 subscriptions

About: MSVMpack is a Multi-class Support Vector Machine (M-SVM) package. It is dedicated to SVMs which can handle more than two classes without relying on decomposition methods and implements the four M-SVM models from the literature: Weston and Watkins M-SVM, Crammer and Singer M-SVM, Lee, Lin and Wahba M-SVM, and the M-SVM2 of Guermeur and Monfrini.

Changes:
  • Windows binaries are now included (by Emmanuel Didiot)
  • MSVMpack can now be compiled on Windows (by Emmanuel Didiot)
  • Fixed polynomial kernel
  • Minor bug fixes
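
The polynomial kernel fixed above is, in its common form, k(x, y) = (gamma * <x, y> + c)^d. A quick stdlib sketch of that form (MSVMpack's exact parameterisation may differ; the parameter names here follow the usual SVM convention):

```python
def polynomial_kernel(x, y, degree=2, gamma=1.0, coef0=1.0):
    """Polynomial kernel k(x, y) = (gamma * <x, y> + coef0) ** degree."""
    dot = sum(a * b for a, b in zip(x, y))
    return (gamma * dot + coef0) ** degree

# <x, y> = 1*3 + 2*0.5 = 4, so k = (4 + 1)^2 = 25.
k = polynomial_kernel((1.0, 2.0), (3.0, 0.5), degree=2)
```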

JMLR Information Theoretical Estimators 0.60

by szzoli - June 3, 2014, 00:17:33 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 54697 views, 11483 downloads, 2 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE supports additionally (i) the combinations of the estimation techniques, (ii) the easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.

Changes:
  • Quick test on the Tsallis divergence: introduced.

  • Pearson chi square divergence estimation in the exponential family (MLE + analytical formula): added.


JMLR Tapkee 1.0

by blackburn - April 10, 2014, 02:45:58 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6516 views, 1893 downloads, 1 subscription

About: Tapkee is an efficient and flexible C++ template library for dimensionality reduction.

Changes:

Initial Announcement on mloss.org.


JMLR MOA Massive Online Analysis Nov-13

by abifet - April 4, 2014, 03:50:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 11953 views, 4723 downloads, 1 subscription

About: Massive Online Analysis (MOA) is a real-time analytics tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license.

Changes:

New version November 2013
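
The Hoeffding Trees mentioned above decide splits from a stream using the Hoeffding bound: after n observations of a statistic with range R, the true mean lies within epsilon of the observed mean with probability 1 - delta. A small illustrative sketch of that bound (not MOA's Java implementation):

```python
import math

def hoeffding_bound(value_range, delta, n):
    """epsilon = sqrt(R^2 * ln(1/delta) / (2n)).

    A Hoeffding tree commits to a split once the observed gain gap between
    the two best attributes exceeds this epsilon.
    """
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

# The bound shrinks with more examples, so the tree can eventually commit.
eps_small_n = hoeffding_bound(1.0, 1e-7, 100)
eps_large_n = hoeffding_bound(1.0, 1e-7, 100000)
```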


JMLR MultiBoost 1.2.02

by busarobi - March 31, 2014, 16:13:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 24770 views, 4332 downloads, 1 subscription

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.

Changes:

Major changes :

  • The “early stopping” feature can now be based on any metric output with the --outputinfo command line argument.

  • Early stopping now works with --slowresume command line argument.

Minor fixes:

  • More informative output when testing.

  • Fixed various compilation glitches with recent clang (OS X/Linux).
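
MultiBoost's engine is built on AdaBoost.MH; at the heart of every AdaBoost variant is the per-round reweighting that focuses later base learners on the examples the current one got wrong. A binary-case sketch of one round (MultiBoost itself is C++ and multi-class, so this is a simplification for illustration):

```python
import math

def adaboost_round(weights, correct):
    """One AdaBoost round: given example weights (a distribution) and a
    per-example correctness mask for the current base learner, return
    (alpha, new_weights)."""
    err = sum(w for w, c in zip(weights, correct) if not c)
    alpha = 0.5 * math.log((1.0 - err) / err)  # base-learner coefficient
    new_w = [w * math.exp(-alpha if c else alpha)
             for w, c in zip(weights, correct)]
    z = sum(new_w)  # renormalise so the weights stay a distribution
    return alpha, [w / z for w in new_w]

# Four examples, one misclassified: err = 0.25, and after the update the
# misclassified example carries half of the total weight.
w = [0.25, 0.25, 0.25, 0.25]
alpha, w2 = adaboost_round(w, [True, True, True, False])
```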


JMLR EnsembleSVM 2.0

by claesenm - March 31, 2014, 08:06:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5802 views, 2055 downloads, 2 subscriptions

About: The EnsembleSVM library offers functionality to perform ensemble learning using Support Vector Machine (SVM) base models. In particular, we offer routines for binary ensemble models using SVM base classifiers. Experimental results have shown the predictive performance to be comparable with standard SVM models but with drastically reduced training time. Ensemble learning with SVM models is particularly useful for semi-supervised tasks.

Changes:

The library has been updated and features a variety of new functionality as well as more efficient implementations of original features. The following key improvements have been made:

  1. Support for multithreading in training and prediction with ensemble models. Since both of these are embarrassingly parallel, this yields a significant speedup (3-fold on quad-core).
  2. Extensive programming framework for aggregation of base model predictions which allows highly efficient prototyping of new aggregation approaches. Additionally we provide several predefined strategies, including (weighted) majority voting, logistic regression and nonlinear SVMs of your choice -- be sure to check out the esvm-edit tool! The provided framework also allows you to efficiently program your own, novel aggregation schemes.
  3. Full code transition to C++11, the latest C++ standard, which enabled various performance improvements. The new release requires moderately recent compilers, such as gcc 4.7.2+ or clang 3.2+.
  4. Generic implementations of convenient facilities have been added, such as thread pools, deserialization factories and more.

The API and ABI have undergone significant changes, many of which are due to the transition to C++11.
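
Point 2 above mentions (weighted) majority voting among base classifiers. A minimal sketch of that aggregation rule over binary {-1, +1} predictions (illustrative only, not EnsembleSVM's C++ API):

```python
def weighted_majority(predictions, weights=None):
    """Aggregate binary {-1, +1} base-model predictions by (weighted) majority vote."""
    if weights is None:
        weights = [1.0] * len(predictions)  # unweighted = plain majority
    score = sum(w * p for w, p in zip(weights, predictions))
    return 1 if score >= 0 else -1

# Three base models: two vote +1, one votes -1 -> ensemble says +1 unweighted...
unweighted = weighted_majority([1, 1, -1])
# ...but a sufficiently trusted dissenter can flip the decision.
weighted = weighted_majority([1, 1, -1], weights=[0.5, 0.5, 2.0])
```

Replacing this rule with logistic regression or an SVM over the base-model outputs is exactly the kind of aggregation strategy the framework lets you swap in.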


JMLR fastclime 1.2.3

by colin1898 - March 10, 2014, 08:54:41 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1853 views, 474 downloads, 1 subscription

About: The package "fastclime" provides a method to recover the precision matrix efficiently by applying the parametric simplex method. The computation is based on a linear optimization solver. It also contains a generic LP solver and a parameterized LP solver using the parametric simplex method.

Changes:

Initial Announcement on mloss.org.


JMLR SHOGUN 3.2.0

by sonne - February 17, 2014, 20:31:36 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 87401 views, 12135 downloads, 5 subscriptions

Rating: 3/5 (based on 6 votes)

About: The SHOGUN machine learning toolbox focuses on large-scale learning methods, with particular emphasis on Support Vector Machines (SVM), and provides interfaces to Python, Octave, Matlab, R, and the command line.

Changes:

This is mostly a bugfix release:

Features

  • Fully support python3 now
  • Add mini-batch k-means [Parijat Mazumdar]
  • Add k-means++ [Parijat Mazumdar]
  • Add sub-sequence string kernel [lambday]

Bugfixes

  • Compile fixes for upcoming swig3.0
  • Speedup for Gaussian process apply()
  • Improve unit / integration test checks
  • libbmrm uninitialized memory reads
  • libocas uninitialized memory reads
  • Octave 3.8 compile fixes [Orion Poplawski]
  • Fix java modular compile error [Bjoern Esser]
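
The k-means++ addition above refers to the D^2 seeding rule: each new centre is picked with probability proportional to its squared distance from the nearest centre chosen so far. A stdlib sketch of that rule (SHOGUN's implementation is C++; the function name here is illustrative):

```python
import random

def kmeanspp_seed(points, k, rng=None):
    """Choose k initial centres by D^2 sampling (Arthur & Vassilvitskii, 2007)."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility in this sketch
    centres = [rng.choice(points)]
    while len(centres) < k:
        # squared distance from each point to its nearest centre so far
        d2 = [min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centres)
              for p in points]
        r = rng.uniform(0.0, sum(d2))
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centres.append(p)
                break
    return centres

# Two tight clusters far apart: the second seed almost surely lands in the
# cluster that did not supply the first one, unlike uniform random seeding.
pts = [(0.0, 0.0), (0.1, 0.0), (10.0, 10.0), (10.1, 10.0)]
centres = kmeanspp_seed(pts, 2)
```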

JMLR BudgetedSVM v1.1

by nemanja - February 12, 2014, 20:53:45 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1592 views, 362 downloads, 1 subscription

About: BudgetedSVM is an open-source C++ toolbox for scalable non-linear classification. The toolbox can be seen as a missing link between LibLinear and LibSVM, combining the efficiency of linear with the accuracy of kernel SVM. We provide an Application Programming Interface for efficient training and testing of non-linear classifiers, supported by data structures designed for handling data which cannot fit in memory. We also provide command-line and Matlab interfaces, providing users with an efficient, easy-to-use tool for large-scale non-linear classification.

Changes:

Changed license from LGPL v3 to Modified BSD.


About: The CTBN-RLE is a C++ package of executables and libraries for inference and learning algorithms for continuous time Bayesian networks (CTBNs).

Changes:

compilation problems fixed


About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference.

Changes:

added factorial mean field inference as a third algorithm complementing expectation propagation and variational Bayes

generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used


JMLR CARP 3.3

by volmeln - November 7, 2013, 15:48:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 15021 views, 4809 downloads, 1 subscription

About: CARP: The Clustering Algorithms’ Referee Package

Changes:

The generalized overlap error and some bugs have been fixed.

