All entries.
Showing items 171-180 of 645 (page 18 of 65).

GP RTSS 1.0

by marc - March 21, 2012, 08:43:52 CET [ BibTeX | BibTeX for corresponding Paper | Download ] 4481 views, 1358 downloads, 1 subscription

About: Gaussian process RTS smoothing (forward-backward smoothing) based on moment matching.
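
For orientation, below is a minimal numpy sketch of the backward (RTS) pass in the linear-Gaussian case, which the moment-matching GP smoother generalises; the function and variable names are illustrative and not part of the package.

    import numpy as np

    def rts_backward_pass(mu_f, P_f, mu_p, P_p, A):
        """Backward smoothing pass for a linear-Gaussian state-space model.
        mu_f, P_f: filtered means/covariances from the forward (Kalman) pass
        mu_p, P_p: one-step-ahead predicted means/covariances
        A:         state transition matrix
        In the GP-based smoother the predicted moments and the cross-covariance
        P_f[t] @ A.T are obtained by moment matching through the GP dynamics.
        """
        T = len(mu_f)
        mu_s, P_s = list(mu_f), list(P_f)   # smoothed estimates, last one equals the filtered one
        for t in range(T - 2, -1, -1):
            J = P_f[t] @ A.T @ np.linalg.inv(P_p[t + 1])        # smoother gain
            mu_s[t] = mu_f[t] + J @ (mu_s[t + 1] - mu_p[t + 1])
            P_s[t] = P_f[t] + J @ (P_s[t + 1] - P_p[t + 1]) @ J.T
        return mu_s, P_s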

Changes:

Initial Announcement on mloss.org.


GPDT Gradient Projection Decomposition Technique 1.01

by sezaza - December 21, 2007, 20:10:43 CET [ Project Homepage | BibTeX | BibTeX for corresponding Paper | Download ] 10991 views, 2044 downloads, 1 subscription

Rating: 4/5 stars (based on 1 vote)

About: GPDT is C++ software designed to train large-scale SVMs for binary classification. The algorithm is also implemented in parallel (PGPDT) for distributed-memory, strictly coupled multiprocessor [...]
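
To illustrate the gradient-projection idea on the SVM dual, here is a simplified numpy sketch: it projects onto the box constraints only and ignores the equality constraint and the working-set decomposition that GPDT actually uses, so treat it as an illustration rather than the solver.

    import numpy as np

    def projected_gradient_step(alpha, K, y, C, step=0.1):
        """One gradient-ascent step on the SVM dual objective
        W(alpha) = sum(alpha) - 0.5 * alpha' Q alpha, with Q_ij = y_i y_j K_ij,
        followed by projection onto the box 0 <= alpha_i <= C."""
        Q = (y[:, None] * y[None, :]) * K
        grad = np.ones_like(alpha) - Q @ alpha     # gradient of the dual objective
        return np.clip(alpha + step * grad, 0.0, C)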

Changes:

Initial Announcement on mloss.org.


GPgrid toolkit for fast GP analysis on grid input 0.1

by ejg20 - September 16, 2013, 18:01:16 CET [ BibTeX | Download ] 3396 views, 1129 downloads, 1 subscription

About: A toolkit for fast Gaussian process (GP) analysis on grid-structured inputs.

Changes:

Initial Announcement on mloss.org.


GPML: Gaussian Processes for Machine Learning Toolbox 4.0 (JMLR)

by hn - October 19, 2016, 10:15:05 CET [ Project Homepage | BibTeX | Download ] 45326 views, 10026 downloads, 5 subscriptions

Rating: 5/5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave/Matlab implementation of inference and prediction with Gaussian process models. The toolbox offers exact inference, approximate inference for non-Gaussian likelihoods (Laplace's method, expectation propagation, variational Bayes) as well as approximations for large datasets (FITC, VFE, KISS-GP). A wide range of covariance, likelihood, mean and hyperprior functions makes it possible to build very complex GP models.
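
As a point of reference for what the toolbox computes in its simplest setting, here is a self-contained numpy sketch of exact GP regression with a squared-exponential covariance. It is illustrative only; GPML's own routines cover far more covariance, likelihood and mean functions plus the approximations listed above, and the names below are not its API.

    import numpy as np

    def gp_exact_regression(X, y, Xs, ell=1.0, sf=1.0, sn=0.1):
        """Exact GP regression: predictive mean and variance at test inputs Xs."""
        def k(A, B):  # squared-exponential covariance
            d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
            return sf**2 * np.exp(-0.5 * d2 / ell**2)

        K = k(X, X) + sn**2 * np.eye(len(X))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        Ks = k(X, Xs)
        mu = Ks.T @ alpha                               # predictive mean
        v = np.linalg.solve(L, Ks)
        var = np.diag(k(Xs, Xs)) - np.sum(v**2, 0)      # predictive variance
        return mu, var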

Changes:

A major code restructuring took place in this release, unifying several inference functions and allowing more flexibility in covariance function composition. We also redesigned the whole derivative computation pipeline, which substantially improves overall runtime, and grid-based covariance approximations are now supported natively.

More generic sparse approximation using Power EP

  • unified treatment of the FITC approximation, the variational approach (VFE) and hybrids

  • inducing input optimisation for all (compositions of) covariance functions, dropping the previous limitation to a few standard examples

  • infFITC is now covered by the more generic infGaussLik function

Approximate covariance object unifying sparse approximations, grid-based approximations and exact covariance computations

  • implementation in cov/apx, cov/apxGrid, cov/apxSparse

  • generic infGaussLik unifies infExact, infFITC and infGrid

  • generic infLaplace unifies infLaplace, infFITC_Laplace and infGrid_Laplace

Hierarchical structure of covariance functions

  • clear hierarchical, compositional implementation

  • no more code duplication as previously present in pairs such as covSEiso and covSEard

  • two mother covariance functions

    • covDot for dot-product-based covariances and

    • covMaha for Mahalanobis-distance-based covariances

  • a variety of modifiers: eye, iso, ard, proj, fact, vlen

  • more flexibility, as more variants are available

  • all covariance functions offer derivatives w.r.t. inputs

Faster derivative computations for mean and cov functions

  • switched from partial derivatives to directional derivatives

  • simpler and more concise interface of mean and cov functions

  • much faster marginal likelihood derivative computations

  • simpler and more compact code

New mean functions

  • new mean/meanWSPC (Weighted Sum of Projected Cosines or Random Kitchen Sink features) following a suggestion by William Herlands

  • new mean/meanWarp for constructing a new mean from an existing one by means of a warping function adapted from William Herlands

New optimizer

  • added a new minimize_minfunc, contributed by Truong X. Nghiem

New GLM link function

  • added the twice logistic link function util/glm_invlink_logistic2

Smaller fixes

  • two-fold speedup of util/elsympol (used by covADD), contributed by Truong X. Nghiem

  • bugfix in util/logphi as reported by John Darby


GPstuff 4.7 (JMLR)

by avehtari - June 9, 2016, 17:45:15 CET [ Project Homepage | BibTeX | BibTeX for corresponding Paper | Download ] 45013 views, 11170 downloads, 3 subscriptions

Rating: 5/5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.
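
As one concrete example of the sparse approximations mentioned above, here is a minimal numpy sketch of the FITC predictive mean with a squared-exponential kernel and inducing inputs Z. It is illustrative only and not the toolbox's interface; all names below are hypothetical.

    import numpy as np

    def fitc_predictive_mean(X, y, Z, Xs, ell=1.0, sf=1.0, sn=0.1):
        """FITC sparse GP predictive mean at test inputs Xs, at O(n m^2) cost
        for n training points and m inducing inputs."""
        def k(A, B):  # squared-exponential covariance
            d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
            return sf**2 * np.exp(-0.5 * d2 / ell**2)

        Kuu = k(Z, Z) + 1e-6 * np.eye(len(Z))             # jitter for stability
        Kuf = k(Z, X)
        qff_diag = np.sum(Kuf * np.linalg.solve(Kuu, Kuf), 0)
        lam = sf**2 - qff_diag + sn**2                    # FITC diagonal correction + noise
        Kuf_li = Kuf / lam                                # Kuf @ diag(1/lam)
        Sigma = Kuu + Kuf_li @ Kuf.T
        return k(Xs, Z) @ np.linalg.solve(Sigma, Kuf_li @ y)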

Changes:

2016-06-09 Version 4.7

Development and release branches are available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo

Improvements

  • Improved use of PSIS
  • More options added to gp_monotonic
  • Monotonicity now works for additive covariance functions with selected variables
  • The gpcf_squared.m covariance function can now be used with derivative observations/monotonicity
  • Default behaviour made more robust by changing the default jitter from 1e-9 to 1e-6
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016), Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models, JMLR, accepted for publication)
  • The selected-variables option now works better with monotonicity

Bugfixes

  • small error in derivative observation computation fixed
  • several minor bug fixes

GPUML: GPUs for kernel machines 4

by balajivasan - February 26, 2010, 18:12:46 CET [ Project Homepage | BibTeX | BibTeX for corresponding Paper | Download ] 7827 views, 1446 downloads, 1 subscription

About: GPUML is a library that provides C/C++ and MATLAB interfaces for speeding up weighted kernel summation and kernel matrix construction on the GPU. These computations arise in several machine learning algorithms, such as kernel density estimation, kernel regression and kernel PCA.
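
For reference, the core operation being accelerated can be written as a plain numpy CPU baseline with a Gaussian kernel (illustrative only, not GPUML's interface; names and parameters below are hypothetical).

    import numpy as np

    def weighted_kernel_sum(X, Y, w, h=1.0):
        """f(y_j) = sum_i w_i * k(y_j, x_i) with a Gaussian kernel of bandwidth h."""
        d2 = np.sum(Y**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * Y @ X.T
        K = np.exp(-d2 / (2 * h**2))     # kernel matrix construction
        return K @ w                     # weighted kernel summation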

Changes:

Initial Announcement on mloss.org.


GradMC 2.00

by tur - April 14, 2014, 15:48:48 CET [ BibTeX | Download ] 5080 views, 1572 downloads, 1 subscription

About: GradMC is an algorithm for magnetic resonance (MR) motion artifact removal, implemented in Matlab.

Changes:

Added support for multi-rigid motion correction.


Graph kernel based on iterative graph similarity and optimal assignments 2008-01-15

by mrupp - September 22, 2008, 13:42:28 CET [ Project Homepage | BibTeX | BibTeX for corresponding Paper | Download ] 10720 views, 1890 downloads, 2 subscriptions

Rating: 4.5/5 stars (based on 1 vote)

About: Java package implementing a kernel for (molecular) graphs based on iterative graph similarity and optimal assignments.
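
To illustrate the optimal-assignment step (not the Java package's API), here is a short Python sketch that assumes a node-pair similarity matrix S has already been computed, e.g. by the iterative graph-similarity refinement.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def optimal_assignment_score(S):
        """Best total similarity over one-to-one assignments of the nodes of
        graph A (rows of S) to the nodes of graph B (columns of S)."""
        rows, cols = linear_sum_assignment(-S)   # negate to maximise similarity
        return S[rows, cols].sum()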

Changes:

Initial Announcement on mloss.org.


Graph Learning Package 0.1

by hiroto - May 4, 2009, 17:07:15 CET [ Project Homepage | BibTeX | BibTeX for corresponding Paper | Download ] 10369 views, 1956 downloads, 0 subscriptions

About: This software performs supervised and unsupervised learning on graph data, where each graph is represented by binary indicators of subgraph features.

Changes:

Initial Announcement on mloss.org.


GraphDemo 1.0

by ule - November 27, 2007, 20:11:21 CET [ Project Homepage | BibTeX | Download ] 5947 views, 1592 downloads, 0 subscriptions

Rating: 3.5/5 stars (based on 3 votes)

About: GraphDemo provides Matlab GUIs for exploring similarity graphs and their use in machine learning. It aims to highlight the behavior of different kinds of similarity graphs and to demonstrate their [...]
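
As an example of the kind of construction such a demo compares, here is a symmetric k-nearest-neighbour similarity graph with Gaussian edge weights, written in numpy (hypothetical parameter names, not the GUI's options).

    import numpy as np

    def knn_similarity_graph(X, k=5, sigma=1.0):
        """Weighted adjacency matrix of a symmetric k-NN similarity graph."""
        d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
        np.fill_diagonal(d2, np.inf)                  # exclude self-loops
        W = np.zeros_like(d2)
        for i, idx in enumerate(np.argsort(d2, axis=1)[:, :k]):
            W[i, idx] = np.exp(-d2[i, idx] / (2 * sigma**2))
        return np.maximum(W, W.T)    # symmetrise: keep an edge if either point is a k-NN of the other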

Changes:

Initial Announcement on mloss.org.

