All entries: showing items 171-180 of 638 (page 18 of 64).

GPML Gaussian Processes for Machine Learning Toolbox 4.0 (JMLR)

by hn - October 19, 2016, 10:15:05 CET [ Project Homepage BibTeX Download ] 42486 views, 9425 downloads, 5 subscriptions

Rating: 5.0/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave/Matlab implementation of inference and prediction with Gaussian process models. The toolbox offers exact inference, approximate inference for non-Gaussian likelihoods (Laplace's method, expectation propagation, variational Bayes), as well as approximations for large datasets (FITC, VFE, KISS-GP). A wide range of covariance, likelihood, mean and hyperprior functions makes it possible to build very complex GP models.
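
As a quick orientation, a minimal GP regression run with the toolbox's standard interface might look as follows (toy data; infGaussLik is the exact-inference routine introduced in release 4.0, see the changes below):

    % minimal GP regression sketch with toy data
    x  = randn(20, 1);  y = sin(x) + 0.1*randn(20, 1);           % training data
    xs = linspace(-3, 3, 61)';                                    % test inputs
    meanfunc = {@meanZero};  covfunc = {@covSEiso};  likfunc = {@likGauss};
    hyp = struct('mean', [], 'cov', [0; 0], 'lik', log(0.1));     % log hyperparameters
    % fit hyperparameters by minimising the negative log marginal likelihood
    hyp = minimize(hyp, @gp, -100, @infGaussLik, meanfunc, covfunc, likfunc, x, y);
    % predictive mean and variance at the test inputs
    [mu, s2] = gp(hyp, @infGaussLik, meanfunc, covfunc, likfunc, x, y, xs);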

Changes:

A major code restructuring took place in this release, unifying several inference functions and allowing more flexibility in covariance function composition. We also redesigned the whole derivative computation pipeline, which strongly improves overall runtime. Finally, grid-based covariance approximations are now included natively.

More generic sparse approximation using Power EP

  • unified treatment of the FITC approximation, the variational approach VFE, and hybrids of the two

  • inducing input optimisation for all (compositions of) covariance functions, dropping the previous limitation to a few standard examples

  • infFITC is now covered by the more generic infGaussLik function (a short usage sketch follows below)
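
A sketch of the new sparse interface, assuming the cov/apxSparse wrapper and the weighting option s described here (s = 0 gives VFE, s = 1 FITC, intermediate values the Power EP hybrid); the exact wrapper spelling and field names should be checked against the toolbox documentation:

    % sparse approximation of the covariance from the dense sketch above
    xu   = linspace(-3, 3, 10)';                                   % initial inducing inputs
    covs = {'apxSparse', {@covSEiso}, xu};
    hyp  = struct('mean', [], 'cov', [0; 0], 'lik', log(0.1), 'xu', xu);  % hyp.xu is optimised too
    inf  = @(varargin) infGaussLik(varargin{:}, struct('s', 0.5));        % Power EP hybrid
    hyp  = minimize(hyp, @gp, -100, inf, {@meanZero}, covs, {@likGauss}, x, y);
    [mu, s2] = gp(hyp, inf, {@meanZero}, covs, {@likGauss}, x, y, xs);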

Approximate covariance object unifying sparse approximations, grid-based approximations and exact covariance computations

  • implementation in cov/apx, cov/apxGrid, cov/apxSparse

  • generic infGaussLik unifies infExact, infFITC and infGrid

  • generic infLaplace unifies infLaplace, infFITC_Laplace and infGrid_Laplace

Hierarchical structure of covariance functions

  • clear hierarchical, compositional implementation

  • no more code duplication as present in covSEiso and covSEard pairs

  • two mother covariance functions

    • covDot for dot-product-based covariances and

    • covMaha for Mahalanobis-distance-based covariances

  • a variety of modifiers: eye, iso, ard, proj, fact, vlen

  • more flexibility, since many more variants can be composed (see the composition sketch after this list)

  • all covariance functions offer derivatives w.r.t. inputs
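
The compositional scheme can be exercised with the familiar composition operators; below is a small sketch combining two covariances on different input dimensions (the backward-compatible covSEiso/covPeriodic wrappers are used here; the new covDot/covMaha mode syntax is documented in the manual):

    % sum of an SE covariance on input dimension 1 and a periodic covariance on dimension 2
    cse  = {@covMask, {1, {@covSEiso}}};
    cper = {@covMask, {2, {@covPeriodic}}};
    cov  = {@covSum, {cse, cper}};
    hypc = zeros(5, 1);                      % 2 SE + 3 periodic log hyperparameters
    X    = randn(20, 2);
    K    = feval(cov{:}, hypc, X);           % 20x20 composite covariance matrix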

Faster derivative computations for mean and cov functions

  • switched from per-hyperparameter partial derivatives to directional derivatives (see the toy example after this list)

  • simpler and more concise interface of mean and cov functions

  • much faster marginal likelihood derivative computations

  • simpler and more compact code
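
For reference, the quantity being computed is the standard marginal likelihood derivative of GP regression; the toy sketch below (plain Matlab, not the toolbox's internal interface) shows why forming the matrix Q once pays off:

    % derivative of the negative log marginal likelihood w.r.t. log(ell), toy SE kernel
    n = 20;  x = linspace(-3, 3, n)';  y = sin(x) + 0.1*randn(n, 1);
    ell = 1.0;  sf2 = 1.0;  sn2 = 0.01;
    D2 = bsxfun(@minus, x, x').^2;            % squared distances
    K  = sf2 * exp(-D2 / (2*ell^2));
    dK = K .* D2 / ell^2;                     % dK / d(log ell)
    Ky    = K + sn2*eye(n);
    alpha = Ky \ y;
    Q     = alpha*alpha' - inv(Ky);           % formed once, shared by ALL hyperparameters
    dnlZ  = -0.5 * sum(sum(Q .* dK));         % = -0.5*trace(Q*dK)
    % the directional-derivative interface passes Q to the covariance function once, so all
    % hyperparameter derivatives come out of a single pass instead of materialising each dK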

New mean functions

  • new mean/meanWSPC (Weighted Sum of Projected Cosines, i.e. random kitchen sink features; sketched after this list) following a suggestion by William Herlands

  • new mean/meanWarp for constructing a new mean from an existing one by means of a warping function adapted from William Herlands
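
A standalone sketch of the weighted-sum-of-projected-cosines form behind meanWSPC (random kitchen sinks); it illustrates the functional form only, not the exact meanWSPC hyperparameter layout:

    % m(x) = sum_j a_j * cos(w_j' * x + b_j)
    d = 2;  m = 10;                                 % input dimension, number of features
    W = randn(m, d);  b = 2*pi*rand(m, 1);  a = randn(m, 1);
    X = randn(50, d);                               % 50 input points
    mx = cos(bsxfun(@plus, X*W', b')) * a;          % 50x1 vector of mean values
    % meanWarp wraps an existing mean m1 with a warping function g, i.e. m2(x) = g(m1(x))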

New optimizer

  • added a new optimisation routine, minimize_minfunc, contributed by Truong X. Nghiem

New GLM link function

  • added the twice logistic link function util/glm_invlink_logistic2

Smaller fixes

  • two-fold speedup of util/elsympol (used by covADD), contributed by Truong X. Nghiem

  • bugfix in util/logphi as reported by John Darby


GPstuff 4.7 (JMLR)

by avehtari - June 9, 2016, 17:45:15 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 42016 views, 10310 downloads, 3 subscriptions

Rating: 5.0/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2016-06-09 Version 4.7

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo

Improvements

  • Improved use of PSIS (Pareto smoothed importance sampling)
  • More options added to gp_monotonic
  • Monotonicity now works for additive covariance functions with selected variables
  • Possibility to use the gpcf_squared.m covariance function with derivative observations/monotonicity
  • Default behaviour made more robust by changing the default jitter from 1e-9 to 1e-6 (set explicitly in the sketch after this list)
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016), Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models, JMLR, accepted for publication)
  • The 'selected variables' option now works better with monotonicity
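
A basic GPstuff regression sketch with the new jitter default set explicitly; function and option names follow the GPstuff demos, but the exact options should be checked against gp_set:

    % Gaussian likelihood + squared-exponential covariance, MAP hyperparameters
    x  = linspace(0, 1, 30)';  y = sin(2*pi*x) + 0.1*randn(30, 1);  xt = linspace(0, 1, 100)';
    lik  = lik_gaussian('sigma2', 0.1);
    gpcf = gpcf_sexp('lengthScale', 0.3, 'magnSigma2', 1);
    gp   = gp_set('lik', lik, 'cf', gpcf, 'jitterSigma2', 1e-6);   % new default jitter
    gp   = gp_optim(gp, x, y);                  % optimise hyperparameters
    [Eft, Varft] = gp_pred(gp, x, y, xt);       % predictive mean and variance at xt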

Bugfixes

  • Fixed a small error in the derivative observation computation
  • several minor bug fixes

GPUML GPUs for kernel machines 4

by balajivasan - February 26, 2010, 18:12:46 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 7555 views, 1394 downloads, 1 subscription

About: GPUML is a library that provides C/C++ and MATLAB interfaces for speeding up weighted kernel summation and kernel matrix construction on the GPU. These computations occur commonly in several machine learning algorithms, such as kernel density estimation, kernel regression and kernel PCA.
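
For reference, the computation GPUML accelerates, written naively in plain Matlab; this is the O(NM) operation itself, not GPUML's own interface:

    % weighted Gaussian kernel summation: f(y_j) = sum_i q_i * exp(-||x_i - y_j||^2 / h^2)
    N = 1000;  M = 500;  d = 3;  h = 0.5;
    X = randn(N, d);  Y = randn(M, d);  q = randn(N, 1);          % sources, targets, weights
    D2 = bsxfun(@plus, sum(Y.^2, 2), sum(X.^2, 2)') - 2*Y*X';     % M x N squared distances
    f  = exp(-D2 / h^2) * q;                                      % M x 1 result
    % GPUML moves this summation (and the kernel matrix construction) onto the GPU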

Changes:

Initial Announcement on mloss.org.


GradMC 2.00

by tur - April 14, 2014, 15:48:48 CET [ BibTeX Download ] 4705 views, 1484 downloads, 1 subscription

About: GradMC is an algorithm for MR motion artifact removal, implemented in Matlab.

Changes:

Added support for multi-rigid motion correction.


Graph kernel based on iterative graph similarity and optimal assignments 2008-01-15

by mrupp - September 22, 2008, 13:42:28 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10435 views, 1835 downloads, 2 subscriptions

Rating: 4.5/5 (based on 1 vote)

About: Java package implementing a kernel for (molecular) graphs based on iterative graph similarity and optimal assignments.

Changes:

Initial Announcement on mloss.org.


Graph Learning Package 0.1

by hiroto - May 4, 2009, 17:07:15 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9964 views, 1895 downloads, 0 subscriptions

About: This software is aimed at performing supervised/unsupervised learning on graph data, where each graph is represented as binary indicators of subgraph features.

Changes:

Initial Announcement on mloss.org.


GraphDemo 1.0

by ule - November 27, 2007, 20:11:21 CET [ Project Homepage BibTeX Download ] 5763 views, 1538 downloads, 0 subscriptions

Rating: 3.5/5 (based on 3 votes)

About: The GraphDemo provides Matlab GUIs to explore similarity graphs and their use in machine learning. It aims to highlight the behavior of different kinds of similarity graphs and to demonstrate their [...]
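
For context, the kind of similarity graph such a demo visualises can be built in a few lines; this is plain Matlab with hypothetical parameter choices, not GraphDemo's API:

    % k-nearest-neighbour similarity graph with Gaussian edge weights
    n = 100;  k = 5;  sigma = 0.5;
    X  = randn(n, 2);                                      % data points
    D2 = bsxfun(@plus, sum(X.^2, 2), sum(X.^2, 2)') - 2*(X*X');
    [~, idx] = sort(D2, 2);                                % neighbours ordered by distance (self first)
    W = zeros(n);
    for i = 1:n
        j = idx(i, 2:k+1);                                 % k nearest neighbours, excluding self
        W(i, j) = exp(-D2(i, j) / (2*sigma^2));
    end
    W = max(W, W');                                        % symmetrise the weight matrix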

Changes:

Initial Announcement on mloss.org.


Graphical Models and Conditional Random Fields Toolbox 2

by jdomke - January 5, 2012, 15:38:20 CET [ Project Homepage BibTeX Download ] 4619 views, 1042 downloads, 1 subscription

About: This is a Matlab/C++ "toolbox" of code for learning and inference with graphical models. It is focused on parameter learning using marginalization in the high-treewidth setting.

Changes:

Initial Announcement on mloss.org.


GraphLab v1-1908

by dannybickson - November 22, 2011, 12:50:00 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 8519 views, 1367 downloads, 1 subscription

About: Multicore/distributed large scale machine learning framework.

Changes:

Update version.


GritBot 2.01

by zenog - September 2, 2011, 14:56:26 CET [ Project Homepage BibTeX Download ] 4054 views, 1032 downloads, 1 subscription

About: GritBot is a data cleaning and outlier/anomaly detection program.

Changes:

Initial Announcement on mloss.org.

