Projects that also appeared in JMLR.
Showing items 1-20 of 44.

MLPACK 3.0.2

by rcurtin - June 9, 2018, 18:03:57 CET. 224233 views, 39221 downloads, 0 subscriptions.

Rating: 4.5/5 (based on 1 vote)

About: A fast, flexible C++ machine learning library, with bindings to other languages.
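
The change list below mentions the Python bindings; as a rough illustration of how such a binding is typically called, here is a minimal sketch. It assumes an mlpack 3.x-style pca binding whose input and new_dimensionality parameters mirror the mlpack_pca command-line program; names may differ between versions, so treat it as a sketch rather than a definitive API reference.

    # Hedged sketch: dimensionality reduction through an assumed mlpack Python binding.
    import numpy as np
    import mlpack

    data = np.random.rand(100, 5)                 # 100 points in 5 dimensions
    result = mlpack.pca(input=data, new_dimensionality=2)
    reduced = result['output']                    # bindings are assumed to return a dict of outputs
    print(reduced.shape)                          # expected: (100, 2)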

Changes:

Released June 8th, 2018.

  • Documentation generation fixes for Python bindings (#1421).
  • Fix build error for man pages if command-line bindings are not being built (#1424).
  • Add shuffle parameter and Shuffle() method to KFoldCV (#1412). This will shuffle the data when the object is constructed, or when Shuffle() is called.
  • Added neural network layers: AtrousConvolution (#1390), Embedding (#1401), and LayerNorm (layer normalization) (#1389).
  • Add Pendulum environment for reinforcement learning (#1388) and update Mountain Car environment (#1394).

dlib ml 19.11

by davis685 - May 18, 2018, 04:19:52 CET. 421352 views, 77404 downloads, 0 subscriptions.

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems.
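
Although the toolkit itself is C++, dlib also ships Python bindings; a minimal face-detection sketch is shown below (it assumes dlib was built with Python support, and "photo.jpg" is a placeholder path).

    # Minimal dlib face-detection sketch (Python bindings assumed available).
    import dlib

    detector = dlib.get_frontal_face_detector()   # HOG-based frontal face detector
    img = dlib.load_rgb_image("photo.jpg")        # placeholder path; load_rgb_image assumed present in this version
    faces = detector(img, 1)                      # upsample once, then detect
    for box in faces:
        print(box.left(), box.top(), box.right(), box.bottom())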

Changes:

This release adds a bunch of new image processing routines as well as many minor usability improvements and bug fixes.


GPML Gaussian Processes for Machine Learning Toolbox 4.1

by hn - November 27, 2017, 19:26:13 CET. 85790 views, 19103 downloads, 0 subscriptions.

Rating: 5/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave/Matlab implementation of inference and prediction with Gaussian process models. The toolbox offers exact inference, approximate inference for non-Gaussian likelihoods (Laplace's method, Expectation Propagation, Variational Bayes), and approximate inference for large datasets (FITC, VFE, KISS-GP). A wide range of covariance, likelihood, mean and hyperprior functions allows very complex GP models to be constructed.
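
For orientation, the quantities that exact GP inference computes can be written down in a few lines; the NumPy sketch below shows the standard posterior mean and variance for a squared-exponential covariance with Gaussian noise. It is purely illustrative and is not the GPML Octave/Matlab API.

    # Illustrative exact GP regression in NumPy (not the GPML toolbox API).
    import numpy as np

    def sq_exp(A, B, ell=1.0, sf=1.0):
        # squared-exponential covariance k(x, z) = sf^2 * exp(-|x - z|^2 / (2 * ell^2))
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sf**2 * np.exp(-0.5 * d2 / ell**2)

    X = np.random.rand(20, 1)                     # training inputs
    y = np.sin(3 * X[:, 0]) + 0.1 * np.random.randn(20)
    Xs = np.linspace(0, 1, 50)[:, None]           # test inputs
    sn2 = 0.01                                    # Gaussian noise variance

    K = sq_exp(X, X) + sn2 * np.eye(len(X))
    Ks = sq_exp(Xs, X)
    mean = Ks @ np.linalg.solve(K, y)             # posterior mean
    var = sq_exp(Xs, Xs).diagonal() - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    print(mean[:3], var[:3])                      # posterior mean and marginal variance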

Changes:

Logdet-estimation functionality for grid-based approximate covariances

  • Lanczos subspace estimation

  • Chebyshev polynomial expansion

More generic infEP functionality

  • dense computations and sparse approximations using the same code

  • covering KL inference as a special case of EP

New infKL function contributed by Emtiyaz Khan and Wu Lin

  • Conjugate-Computation Variational Inference algorithm

  • much more scalable than previous versions

Time-series covariance functions on the positive real line

  • covW (i-times integrated) Wiener process covariance

  • covOU (i-times integrated) Ornstein-Uhlenbeck process covariance (contributed by Juan Pablo Carbajal)

  • covULL underdamped linear Langevin process covariance (contributed by Robert MacKay)

  • covFBM Fractional Brownian motion covariance

New covariance functions

  • covWarp implements k(w(x),w(z)) where w is a "warping" function

  • covMatern has been extended to also accept non-integer distance parameters


Jstacs 2.3

by keili - September 13, 2017, 14:25:38 CET. 62374 views, 13766 downloads, 0 subscriptions.

About: A Java framework for statistical analysis and classification of biological sequences

Changes:

New classes and packages:

  • Jstacs 2.3 is the first release to be accompanied by JstacsFX, a library for building JavaFX-based graphical user interfaces based on JstacsTools
  • new interface MultiThreadedFunction
  • new class LargeSequenceReader for reading large sequence files in chunks
  • new interface QuickScanningSequenceScore
  • new class RegExpValidator for checking String inputs against a regular expression
  • new class IUPACDNAAlphabet

New features and improvements:

  • Alignments may now handle different costs for insert and delete gaps
  • ListResults may now be constructed from Collections of ResultSets
  • Several minor improvements and bugfixes in many classes
  • Improvements of documentation of several classes

MSVMpack 1.5.1

by lauerfab - March 9, 2017, 12:29:37 CET. 62041 views, 15104 downloads, 0 subscriptions.

About: MSVMpack is a Multi-class Support Vector Machine (M-SVM) package. It is dedicated to SVMs which can handle more than two classes without relying on decomposition methods and implements the four M-SVM models from the literature: Weston and Watkins M-SVM, Crammer and Singer M-SVM, Lee, Lin and Wahba M-SVM, and the M-SVM2 of Guermeur and Monfrini.

Changes:
  • Fix compilation error with recent gcc

scikit-learn 0.18.1

by fabianp - November 28, 2016, 17:45:27 CET. 64455 views, 20978 downloads, 0 subscriptions.

Rating: 4.5/5 (based on 3 votes)

About: The scikit-learn project is a machine learning library in Python.
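
A minimal usage sketch of the standard scikit-learn API (the estimator and dataset here are chosen only for illustration):

    # Cross-validated accuracy of a random forest on the iris dataset.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    X, y = load_iris(return_X_y=True)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)     # 5-fold cross-validation
    print(scores.mean())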

Changes:

Update for 0.18.1


Information Theoretical Estimators 0.63

by szzoli - June 9, 2016, 23:42:14 CET. 312485 views, 58560 downloads, 0 subscriptions.

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE additionally supports (i) combinations of the estimation techniques, (ii) easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.
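
For orientation, the simplest of these quantities is the plug-in Shannon entropy of binned samples; the NumPy sketch below shows that baseline. ITE itself implements far more sophisticated estimators (e.g. k-nearest-neighbour based ones), and this is not its API.

    # Plug-in Shannon entropy of binned 1-D samples (baseline illustration only).
    import numpy as np

    x = np.random.randn(10000)
    counts, _ = np.histogram(x, bins=50)
    p = counts / counts.sum()
    p = p[p > 0]                                  # drop empty bins so the log is defined
    H = -(p * np.log(p)).sum()                    # entropy in nats
    print(H)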

Changes:
  • Conditional Shannon entropy estimation: added.

  • Conditional Shannon mutual information estimation: included.


GPstuff 4.7

by avehtari - June 9, 2016, 17:45:15 CET. 88320 views, 20268 downloads, 0 subscriptions.

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2016-06-09 Version 4.7

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo

Improvements

  • Improved use of PSIS
  • More options added to gp_monotonic
  • Monotonicity now works for additive covariance functions with selected variables
  • Possibility to use the gpcf_squared.m covariance function with derivative observations/monotonicity
  • Default behaviour made more robust by changing the default jitter from 1e-9 to 1e-6
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016), Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models, JMLR, accepted for publication)
  • The selected-variables option now works better with monotonicity

Bugfixes

  • small error in derivative observation computation fixed
  • several minor bug fixes

JKernelMachines 3.0

by dpicard - May 4, 2016, 17:53:28 CET. 95294 views, 19345 downloads, 0 subscriptions.

Rating: 4.5/5 (based on 4 votes)

About: A machine learning library in Java for easy development of new kernels and kernel algorithms.

Changes:

Version 3.0

Warning: this version is incompatible with code written for previous versions.

  • change license to BSD 3-clause
  • change package name to net.jkernelmachines
  • change to Maven build system (available through Maven Central)
  • online training interfaces to allow continuous online learning
  • add a new budget-oriented kernel classifier
  • new kernel and processing especially for strings

Darwin 1.9

by sgould - September 8, 2015, 06:50:37 CET. 133752 views, 29995 downloads, 0 subscriptions.

About: A platform-independent C++ framework for machine learning, graphical models, and computer vision research and development.

Changes:

Version 1.9:

  • Replaced drwnInPaint class with drwnImageInPainter class and added inPaint application
  • Added function to read CIFAR-10 and CIFAR-100 style datasets (see http://www.cs.utoronto.ca/~kriz/cifar.html)
  • Added drwnMaskedPatchMatch, drwnBasicPatchMatch, drwnSelfPatchMatch and basicPatchMatch application
  • drwnPatchMatchGraph now allows multiple matches to the same image
  • Upgraded wxWidgets to 3.0.2 (problems on Mac OS X)
  • Switched Mac OS X compilation to libc++ instead of libstdc++
  • Added Python scripts for running experiments and regression tests
  • Refactored drwnGrabCutInstance class to support both GMM and colour histogram model
  • Added cacheSortIndex to drwnDecisionTree for trading-off speed versus memory usage
  • Added mexLoadPatchMatchGraph for loading drwnPatchMatchGraph objects into Matlab
  • Improved documentation, other bug fixes and performance improvements

libDAI 0.3.2

by jorism - July 17, 2015, 15:59:55 CET. 93568 views, 18267 downloads, 0 subscriptions.

Rating: 5/5 (based on 1 vote)

About: libDAI provides free & open source implementations of various (approximate) inference methods for graphical models with discrete variables, including Bayesian networks and Markov Random Fields.

Changes:

Release 0.3.2 fixes various bugs and adds GLC (Generalized Loop Corrections) written by Siamak Ravanbakhsh.


Sally 1.0.0

by konrad - March 26, 2015, 17:01:35 CET. 110340 views, 21117 downloads, 0 subscriptions.

About: A Tool for Embedding Strings in Vector Spaces

Changes:

Support for explicit selection of granularity added. Several minor bug fixes. We have reached version 1.0.


Mulan 1.5.0

by lefman - February 23, 2015, 21:19:05 CET. 54738 views, 14656 downloads, 0 subscriptions.

About: Mulan is an open-source Java library for learning from multi-label datasets. Multi-label datasets consist of training examples of a target function that has multiple binary target variables. This means that each item of a multi-label dataset can be a member of multiple categories or annotated by many labels (classes). This is actually the nature of many real world problems such as semantic annotation of images and video, web page categorization, direct marketing, functional genomics and music categorization into genres and emotions.
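
Mulan itself is a Java library; purely to illustrate the multi-label setting it targets, here is a sketch (in Python, using scikit-learn) of the binary-relevance transformation, one of the simplest multi-label approaches: each label gets its own binary classifier.

    # Binary-relevance multi-label classification sketch (illustration of the
    # problem setting only; Mulan's own API is Java).
    from sklearn.datasets import make_multilabel_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.multiclass import OneVsRestClassifier

    X, Y = make_multilabel_classification(n_samples=200, n_classes=5, random_state=0)
    clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
    print(clf.predict(X[:3]))                     # each row is a binary label-indicator vector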

Changes:

Learners

  • MLCSSP.java: Added the MLCSSP algorithm (from ICML 2013)
  • Enhancements of multi-target regression capabilities
  • Improved CLUS support
  • Added pairwise classifier and pairwise transformation

Measures/Evaluation

  • Providing training data in the Evaluator is unnecessary in the case of specific measures.
  • Examples with missing ground truth are not skipped for measures that handle missing values.
  • Added logistic and squared-error losses and measures

Bug fixes

  • IndexOutOfBounds in calculation of MiAP and GMiAP
  • Bug fix in Rcut.java
  • When in rank/score mode, the meta-data contained additional unnecessary attributes. (Newton Spolaor)

API changes

  • Upgrade to Java 7
  • Upgrade to Weka 3.7.10

Miscellaneous

  • Small changes and improvements in the wrapper classes for the CLUS library
  • ENTCS13FeatureSelection.java (new experiment)
  • Enumeration is now used for specifying the type of meta-data. (Newton Spolaor)

DLLearner 1.0

by Jens - February 13, 2015, 11:39:46 CET. 45809 views, 9513 downloads, 0 subscriptions.

Rating: 4.5/5 (based on 3 votes)

About: The DL-Learner framework contains several algorithms for supervised concept learning in Description Logics (DLs) and OWL.

Changes:

See http://dl-learner.org/development/changelog/.


SHOGUN 4.0.0

by sonne - February 5, 2015, 09:09:37 CET. 210221 views, 33758 downloads, 0 subscriptions.

Rating: 3/5 (based on 6 votes)

About: The SHOGUN machine learning toolbox focuses on large-scale learning methods, with an emphasis on Support Vector Machines (SVM), and provides interfaces to Python, Octave, Matlab, R and the command line.

Changes:

This release features the work of our 8 GSoC 2014 students [student; mentors]:

  • OpenCV Integration and Computer Vision Applications [Abhijeet Kislay; Kevin Hughes]
  • Large-Scale Multi-Label Classification [Abinash Panda; Thoralf Klein]
  • Large-scale structured prediction with approximate inference [Jiaolong Xu; Shell Hu]
  • Essential Deep Learning Modules [Khaled Nasr; Sergey Lisitsyn, Theofanis Karaletsos]
  • Fundamental Machine Learning: decision trees, kernel density estimation [Parijat Mazumdar ; Fernando Iglesias]
  • Shogun Missionary & Shogun in Education [Saurabh Mahindre; Heiko Strathmann]
  • Testing and Measuring Variable Interactions With Kernels [Soumyajit De; Dino Sejdinovic, Heiko Strathmann]
  • Variational Learning for Gaussian Processes [Wu Lin; Heiko Strathmann, Emtiyaz Khan]

It also contains several cleanups and bugfixes:

Features

  • New Shogun project description [Heiko Strathmann]
  • ID3 algorithm for decision tree learning [Parijat Mazumdar]
  • New modes for PCA matrix factorizations: SVD & EVD, in-place or reallocating [Parijat Mazumdar]
  • Add Neural Networks with linear, logistic and softmax neurons [Khaled Nasr]
  • Add kernel multiclass strategy examples in multiclass notebook [Saurabh Mahindre]
  • Add decision trees notebook containing examples for ID3 algorithm [Parijat Mazumdar]
  • Add sudoku recognizer ipython notebook [Alejandro Hernandez]
  • Add in-place subsets on features, labels, and custom kernels [Heiko Strathmann]
  • Add Principal Component Analysis notebook [Abhijeet Kislay]
  • Add Multiple Kernel Learning notebook [Saurabh Mahindre]
  • Add Multi-Label classes to enable Multi-Label classification [Thoralf Klein]
  • Add rectified linear neurons, dropout and max-norm regularization to neural networks [Khaled Nasr]
  • Add C4.5 algorithm for multiclass classification using decision trees [Parijat Mazumdar]
  • Add support for arbitrary acyclic graph-structured neural networks [Khaled Nasr]
  • Add CART algorithm for classification and regression using decision trees [Parijat Mazumdar]
  • Add CHAID algorithm for multiclass classification and regression using decision trees [Parijat Mazumdar]
  • Add Convolutional Neural Networks [Khaled Nasr]
  • Add Random Forests algorithm for ensemble learning using CART [Parijat Mazumdar]
  • Add Restricted Boltzmann Machines [Khaled Nasr]
  • Add Stochastic Gradient Boosting algorithm for ensemble learning [Parijat Mazumdar]
  • Add Deep contractive and denoising autoencoders [Khaled Nasr]
  • Add Deep belief networks [Khaled Nasr]

Bugfixes

  • Fix reference counting bugs in CList when reference counting is on [Heiko Strathmann, Thoralf Klein, lambday]
  • Fix memory problem in PCA::apply_to_feature_matrix [Parijat Mazumdar]
  • Fix crash in LeastAngleRegression for the case D greater than N [Parijat Mazumdar]
  • Fix memory violations in bundle method solvers [Thoralf Klein]
  • Fix fail in library_mldatahdf5.cpp example when http://mldata.org is not working properly [Parijat Mazumdar]
  • Fix memory leaks in Vowpal Wabbit, LibSVMFile and KernelPCA [Thoralf Klein]
  • Fix memory and control flow issues discovered by Coverity [Thoralf Klein]
  • Fix R modular interface SWIG typemap (Requires SWIG >= 2.0.5) [Matt Huska]

Cleanup and API Changes

  • PCA now depends on Eigen3 instead of LAPACK [Parijat Mazumdar]
  • Removing redundant and fixing implicit imports [Thoralf Klein]
  • Hide many methods from SWIG, reducing compile memory by 500MiB [Heiko Strathmann, Fernando Iglesias, Thoralf Klein]

RL library 3.00.00

by frezza - January 13, 2015, 04:15:16 CET. 13017 views, 3394 downloads, 0 subscriptions.

About: A template-based C++ reinforcement learning library

Changes:

Initial Announcement on mloss.org.


Waffles 2014-07-05

by mgashler - July 20, 2014, 04:53:54 CET. 81489 views, 18971 downloads, 0 subscriptions.

About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.)

Changes:

Added support for CUDA GPU-parallelized neural network layers, and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html


Tapkee 1.0

by blackburn - April 10, 2014, 02:45:58 CET. 31793 views, 7839 downloads, 0 subscriptions.

About: Tapkee is an efficient and flexible C++ template library for dimensionality reduction.

Changes:

Initial Announcement on mloss.org.


MOA Massive Online Analysis Nov-13

by abifet - April 4, 2014, 03:50:20 CET. 35747 views, 10064 downloads, 0 subscriptions.

About: Massive Online Analysis (MOA) is a real-time analytics tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license.

Changes:

New version November 2013


MultiBoost 1.2.02

by busarobi - March 31, 2014, 16:13:04 CET. 87133 views, 14333 downloads, 0 subscriptions.

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.

Changes:

Major changes:

  • The “early stopping” feature can now be based on any metric output with the --outputinfo command-line argument.

  • Early stopping now works with the --slowresume command-line argument.

Minor fixes:

  • More informative output when testing.

  • Various compilation glitches with recent clang (OS X/Linux).

