Projects that also appeared in JMLR.
Showing items 1-20 of 43 (page 1 of 3).

JMLR Darwin 1.8

by sgould - September 3, 2014, 08:42:53 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 28845 views, 6042 downloads, 4 subscriptions

About: A platform-independent C++ framework for machine learning, graphical models, and computer vision research and development.

Changes:

Version 1.8:

  • Added Superpixel Graph Label Transfer (nnGraph) project
  • Added Python scripts for automating some projects
  • Added ability to pre-process features on-the-fly with one drwnFeatureTransform when applying or learning another drwnFeatureTransform
  • Fixed race condition in Windows threading (thanks to Edison Guo)
  • Switched Windows and Linux to build against OpenCV 2.4.9
  • Changed drwnMAPInference::inference to return upper and lower energy bounds
  • Added pruneRounds function to drwnBoostedClassifier
  • Added drwnSLICSuperpixels function
  • Added drwnIndexQueue class
  • mexLearnClassifier and mexAnalyseClassifier now support integer label types
  • Bug fix in mexSaveSuperpixels to support single channel

JMLR MLPACK 1.0.10

by rcurtin - August 29, 2014, 21:26:18 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 33401 views, 6640 downloads, 6 subscriptions

Rating: 4.5/5 (based on 1 vote)

About: A scalable, fast C++ machine learning library, with emphasis on usability.

Changes:
  • Bugfix for a NeighborSearch regression which caused very slow allknn/allkfn; speeds are now restored to approximately 1.0.8 levels, with significant improvement for the cover tree (#365). (See the usage sketch after this list.)
  • Detect dependencies correctly when ARMA_USE_WRAPPER is not defined (i.e. libarmadillo.so does not exist).
  • Bugfix for compilation under Visual Studio (#366).
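
The NeighborSearch fix above concerns the all-k-nearest-neighbors code path. As a rough illustration, here is a minimal sketch of that usage against the mlpack 1.0.x C++ API; the dataset file name and the choice of k are placeholders.

```cpp
#include <mlpack/core.hpp>
#include <mlpack/methods/neighbor_search/neighbor_search.hpp>

using namespace mlpack;

int main()
{
  // Load a dataset; mlpack stores points as columns (column-major).
  arma::mat data;
  data::Load("dataset.csv", data, true /* fatal on failure */);

  // AllkNN is the all-k-nearest-neighbors typedef of NeighborSearch
  // (dual-tree search with a kd-tree by default).
  neighbor::AllkNN allknn(data);

  // Find the 5 nearest neighbors of every point in the dataset.
  arma::Mat<size_t> neighbors;
  arma::mat distances;
  allknn.Search(5, neighbors, distances);

  // neighbors(i, j) is the index of the (i+1)-th nearest neighbor of point j.
  return 0;
}
```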

JMLR dlib ml 18.10

by davis685 - August 29, 2014, 02:56:23 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 82961 views, 14354 downloads, 2 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software to solve real-world problems.

Changes:

In addition to a number of usability improvements, this release adds an implementation of the recent paper "One Millisecond Face Alignment with an Ensemble of Regression Trees" by Vahid Kazemi and Josephine Sullivan. This includes tools for performing high quality face landmarking as well as tools for training new landmarking models. See the face_landmark_detection_ex.cpp and train_shape_predictor_ex.cpp example programs for an introduction.
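
As a rough illustration of the detection side, the sketch below follows the flow of face_landmark_detection_ex.cpp: detect faces, then fit the shape predictor inside each detection box. The model and image file names are placeholders; a predictor must first be trained (e.g. with train_shape_predictor_ex.cpp) or obtained separately.

```cpp
#include <iostream>
#include <vector>

#include <dlib/image_processing/frontal_face_detector.h>
#include <dlib/image_processing.h>
#include <dlib/image_io.h>

using namespace dlib;

int main()
{
    // HOG-based frontal face detector shipped with dlib.
    frontal_face_detector detector = get_frontal_face_detector();

    // Landmarking model produced by the shape predictor trainer
    // (file name is a placeholder).
    shape_predictor sp;
    deserialize("shape_predictor.dat") >> sp;

    array2d<rgb_pixel> img;
    load_image(img, "face.jpg");

    // Detect faces, then locate the landmarks within each detection.
    std::vector<rectangle> faces = detector(img);
    for (unsigned long i = 0; i < faces.size(); ++i)
    {
        full_object_detection shape = sp(img, faces[i]);
        std::cout << "face " << i << ": " << shape.num_parts()
                  << " landmarks" << std::endl;
    }
    return 0;
}
```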


JMLR JKernelMachines 2.4

by dpicard - July 24, 2014, 13:51:44 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12323 views, 3108 downloads, 2 subscriptions

Rating: 2.5/5 (based on 1 vote)

About: A machine learning library in Java for easy development of new kernels.

Changes:

Version 2.4

  • Added a simple GUI to rapidly test some algorithms
  • New Active Learning package
  • New algorithms (LLSVM, KMeans)
  • New kernels (polynomials, component-wise)
  • Many bugfixes and improvements to existing algorithms
  • Many optimizations

The number of changes in this version is massive, so please test it and don't forget to report any regressions.


JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 14121 views, 3512 downloads, 2 subscriptions

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2014-07-22 Version 4.5

New features

  • Input dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • New option 'autoscale' in gp_rnd.m to obtain split-normal approximate samples from the posterior predictive distribution of the latent variable.

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.

Improvements

  • New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating the new features:
    • demo_epinf, demonstrating the input dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes


JMLR Waffles 2014-07-05

by mgashler - July 20, 2014, 04:53:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 24090 views, 7097 downloads, 2 subscriptions

About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.)

Changes:

Added support for CUDA GPU-parallelized neural network layers, and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html


JMLR MSVMpack 1.5

by lauerfab - July 3, 2014, 16:02:49 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 11643 views, 3931 downloads, 2 subscriptions

About: MSVMpack is a Multi-class Support Vector Machine (M-SVM) package. It is dedicated to SVMs which can handle more than two classes without relying on decomposition methods and implements the four M-SVM models from the literature: Weston and Watkins M-SVM, Crammer and Singer M-SVM, Lee, Lin and Wahba M-SVM, and the M-SVM2 of Guermeur and Monfrini.

Changes:
  • Windows binaries are now included (by Emmanuel Didiot)
  • MSVMpack can now be compiled on Windows (by Emmanuel Didiot)
  • Fixed polynomial kernel
  • Minor bug fixes

JMLR Sally 0.9.0

by konrad - July 1, 2014, 22:43:51 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 20391 views, 4128 downloads, 2 subscriptions

About: A Tool for Embedding Strings in Vector Spaces

Changes:

Support for hash-based dimension reduction: simhash, minhash and Bloom filter. Support for several n-gram variants: regular, sorted, positional and blended n-grams. Simplified configuration.


JMLR Information Theoretical Estimators 0.60

by szzoli - June 3, 2014, 00:17:33 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 49413 views, 10489 downloads, 2 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE supports additionally (i) the combinations of the estimation techniques, (ii) the easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.

Changes:
  • Introduced a quick test for the Tsallis divergence.

  • Added Pearson chi-square divergence estimation in the exponential family (MLE + analytical formula).


JMLR Tapkee 1.0

by blackburn - April 10, 2014, 02:45:58 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6018 views, 1730 downloads, 1 subscription

About: Tapkee is an efficient and flexible C++ template library for dimensionality reduction.

Changes:

Initial Announcement on mloss.org.
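
To give a feel for the template API described in the blurb above, here is a minimal sketch adapted from the project's introductory example: the user supplies a distance callback and a list of indices, and Tapkee returns the embedding as an Eigen matrix. The method, dimensions, and toy distance are arbitrary here, and exact header paths and namespace details should be checked against the release.

```cpp
#include <cmath>
#include <iostream>
#include <vector>

#include <tapkee/tapkee.hpp>

using namespace tapkee;

// A toy distance callback over the indices 0..N-1.
struct MyDistanceCallback
{
    ScalarType distance(IndexType l, IndexType r)
    {
        return std::abs(static_cast<double>(l) - static_cast<double>(r));
    }
};

int main()
{
    const int N = 100;
    std::vector<IndexType> indices(N);
    for (int i = 0; i < N; ++i)
        indices[i] = i;

    MyDistanceCallback d;

    // Embed the points into one dimension with multidimensional scaling.
    TapkeeOutput output = initialize()
        .withParameters((method = MultidimensionalScaling,
                         target_dimension = 1))
        .withDistance(d)
        .embedUsing(indices);

    // output.embedding is a dense (Eigen) matrix of embedded coordinates.
    std::cout << output.embedding.transpose() << std::endl;
    return 0;
}
```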


JMLR MOA Massive Online Analysis Nov-13

by abifet - April 4, 2014, 03:50:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 11415 views, 4478 downloads, 1 subscription

About: Massive Online Analysis (MOA) is a real-time analytics tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license.

Changes:

New version November 2013


JMLR MultiBoost 1.2.02

by busarobi - March 31, 2014, 16:13:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 23421 views, 4118 downloads, 1 subscription

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.

Changes:

Major changes:

  • The “early stopping” feature can now be based on any metric output with the --outputinfo command-line argument.

  • Early stopping now works with the --slowresume command-line argument.

Minor fixes:

  • More informative output when testing.

  • Fixed various compilation glitches with recent clang (OS X/Linux).


JMLR EnsembleSVM 2.0

by claesenm - March 31, 2014, 08:06:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5177 views, 1826 downloads, 2 subscriptions

About: The EnsembleSVM library offers functionality to perform ensemble learning using Support Vector Machine (SVM) base models. In particular, we offer routines for binary ensemble models using SVM base classifiers. Experimental results have shown the predictive performance to be comparable with standard SVM models but with drastically reduced training time. Ensemble learning with SVM models is particularly useful for semi-supervised tasks.

Changes:

The library has been updated and features a variety of new functionality as well as more efficient implementations of original features. The following key improvements have been made:

  1. Support for multithreading in training and prediction with ensemble models. Since both of these are embarrassingly parallel, this yields a significant speedup (roughly 3-fold on a quad-core machine).
  2. Extensive programming framework for aggregation of base model predictions, which allows highly efficient prototyping of new aggregation approaches. Additionally, we provide several predefined strategies, including (weighted) majority voting, logistic regression and nonlinear SVMs of your choice -- be sure to check out the esvm-edit tool! The provided framework also allows you to efficiently program your own, novel aggregation schemes.
  3. Full code transition to C++11, the latest C++ standard, which enabled various performance improvements. The new release requires moderately recent compilers, such as gcc 4.7.2+ or clang 3.2+.
  4. Generic implementations of convenient facilities have been added, such as thread pools, deserialization factories and more.

The API and ABI have undergone significant changes, many of which are due to the transition to C++11.


JMLR fastclime 1.2.3

by colin1898 - March 10, 2014, 08:54:41 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1583 views, 397 downloads, 1 subscription

About: The package "fastclime" provides a method to recover the precision matrix efficiently by applying the parametric simplex method. The computation is based on a linear optimization solver. It also contains a generic LP solver and a parameterized LP solver using the parametric simplex method.

Changes:

Initial Announcement on mloss.org.


JMLR SHOGUN 3.2.0

by sonne - February 17, 2014, 20:31:36 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 85151 views, 11811 downloads, 5 subscriptions

Rating: 3/5 (based on 6 votes)

About: The SHOGUN machine learning toolbox focuses on large-scale learning methods, with particular emphasis on Support Vector Machines (SVM), and provides interfaces to Python, Octave, Matlab, R and the command line.

Changes:

This is mostly a bugfix release:

Features

  • Fully support python3 now
  • Add mini-batch k-means [Parijat Mazumdar]
  • Add k-means++ [Parijat Mazumdar] (see the usage sketch at the end of this entry)
  • Add sub-sequence string kernel [lambday]

Bugfixes

  • Compile fixes for upcoming swig3.0
  • Speedup for Gaussian process apply()
  • Improve unit / integration test checks
  • libbmrm uninitialized memory reads
  • libocas uninitialized memory reads
  • Octave 3.8 compile fixes [Orion Poplawski]
  • Fix java modular compile error [Bjoern Esser]
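
To give context for the k-means additions above, here is a minimal sketch of plain KMeans training through SHOGUN's C++ interface. The toy data are arbitrary, and the switches that enable the new k-means++ seeding or mini-batch updates are not shown, since their exact names are not stated in this announcement.

```cpp
#include <iostream>

#include <shogun/base/init.h>
#include <shogun/features/DenseFeatures.h>
#include <shogun/distance/EuclideanDistance.h>
#include <shogun/clustering/KMeans.h>

using namespace shogun;

int main()
{
    init_shogun_with_defaults();

    // Toy 2-dimensional data: one column per example, two well-separated groups.
    double raw[] = {0.0, 0.0,  0.1, 0.1,  5.0, 5.0,  5.1, 5.1};
    SGMatrix<float64_t> data(2, 4);
    for (int j = 0; j < 4; ++j)
        for (int i = 0; i < 2; ++i)
            data(i, j) = raw[2 * j + i];

    CDenseFeatures<float64_t>* features = new CDenseFeatures<float64_t>(data);
    CEuclideanDistance* distance = new CEuclideanDistance(features, features);

    // Cluster into k = 2 groups with Lloyd's algorithm.
    CKMeans* kmeans = new CKMeans(2, distance);
    SG_REF(kmeans);
    kmeans->train();

    // Retrieve the learned cluster centers (one column per center).
    SGMatrix<float64_t> centers = kmeans->get_cluster_centers();
    std::cout << "learned " << centers.num_cols << " centers of dimension "
              << centers.num_rows << std::endl;

    SG_UNREF(kmeans);
    exit_shogun();
    return 0;
}
```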

JMLR BudgetedSVM v1.1

by nemanja - February 12, 2014, 20:53:45 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1371 views, 271 downloads, 1 subscription

About: BudgetedSVM is an open-source C++ toolbox for scalable non-linear classification. The toolbox can be seen as a missing link between LibLinear and LibSVM, combining the efficiency of linear with the accuracy of kernel SVM. We provide an Application Programming Interface for efficient training and testing of non-linear classifiers, supported by data structures designed for handling data which cannot fit in memory. We also provide command-line and Matlab interfaces, providing users with an efficient, easy-to-use tool for large-scale non-linear classification.

Changes:

Changed license from LGPL v3 to Modified BSD.


JMLR CTBN-RLE

About: The CTBN-RLE is a C++ package of executables and libraries for inference and learning algorithms for continuous time Bayesian networks (CTBNs).

Changes:

compilation problems fixed


JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.4

by hn - November 11, 2013, 14:46:52 CET [ Project Homepage BibTeX Download ] 19199 views, 4545 downloads, 3 subscriptions

Rating: 5/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.

Changes:
  • derivatives w.r.t. inducing points xu in infFITC, infFITC_Laplace, infFITC_EP so that one can treat the inducing points either as fixed given quantities or as additional hyperparameters
  • new GLM likelihood likExp for inter-arrival time modeling
  • new GLM likelihood likWeibull for extremal value regression
  • new GLM likelihood likGumbel for extremal value regression
  • new mean function meanPoly depending polynomially on the data
  • infExact can deal safely with the zero noise variance limit
  • support of GP warping through the new likelihood function likGaussWarp

JMLR glm-ie

About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference.

Changes:

added factorial mean field inference as a third algorithm complementing expectation propagation and variational Bayes

generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used


JMLR CARP 3.3

by volmeln - November 7, 2013, 15:48:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 14345 views, 4595 downloads, 1 subscription

About: CARP: The Clustering Algorithms’ Referee Package

Changes:

Generalized overlap error and some bugs have been fixed

