Projects tagged with Gaussian processes.


pyGPs 1.2

by mn - July 17, 2014, 10:28:55 CET [ Project Homepage BibTeX Download ] 1733 views, 422 downloads, 2 subscriptions

About: pyGPs is a Python package for Gaussian process (GP) regression and classification for machine learning.

Changes:

Changelog pyGPs v1.2

June 30th 2014

structural updates:

  • input targets can now be either a 2-d array of shape (n,1) or a 1-d array of shape (n,)
  • setup.py updated
  • "import pyGPs" instead of "from pyGPs.Core import gp"
  • rename ".train()" to ".optimize()"
  • rename "Graph-stuff" to "graphExtension"
  • rename kernelOnGraph to "nodeKernels" and graphKernel to "graphKernels"
  • redundancy removed for model.setData(x,y)
  • rewrite "mean.proceed()" to "getMean()" and "getDerMatrix()"
  • rewrite "cov.proceed()" to "getCovMatrix()" and "getDerMatrix()"
  • rename cov.LIN to cov.Linear (to be consistent with mean.Linear)
  • rename module "valid" to "validation"
  • add graph dataset Mutag in both .npz and .mat formats
  • add graphUtil.normalizeKernel()
  • fix the number-of-iterations problem in the graphKernels "PropagationKernel"
  • add unit tests for covariance and mean functions
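
To illustrate the renamed API, here is a minimal regression sketch. The shapes and calls follow the items above; the toy data and the particular kernel composition are illustrative only and not part of the release.

    import numpy as np
    import pyGPs

    # Toy 1-d regression data; targets may now be shape (n,) or (n,1).
    x = np.linspace(-3, 3, 20).reshape(-1, 1)
    y = np.sin(x).ravel() + 0.1 * np.random.randn(20)

    model = pyGPs.GPR()                   # top-level import replaces "from pyGPs.Core import gp"
    model.setData(x, y)                   # accepts a 1-d target of shape (n,)
    model.setPrior(kernel=pyGPs.cov.Linear() + pyGPs.cov.RBF())  # cov.LIN is now cov.Linear
    model.optimize()                      # formerly model.train()
    ym, ys2, fm, fs2, lp = model.predict(np.linspace(-3, 3, 50).reshape(-1, 1))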

bug fixes:

  • derivatives for cov.LINard
  • derivative of the scalar for cov.covScale
  • demo_GPR_FITC.py missing pyGPs.mean

July 8th 2014

structural updates:

  • add hyperparameter (signal variance s2) for the linear covariance
  • add unit testing for inference and likelihood functions as well as models
  • no longer show (print) the "maximum number of sweeps" warning in EP inference
  • documentation updated

bug fixes:

  • typos in lik.Laplace
  • derivative in lik.Laplace

July 14th 2014

documentation updates:

  • online docs updated
  • API file updated

structural updates:

  • made methods that users don't need to call private

Kernel Adaptive Filtering Toolbox 1.4

by steven2358 - May 26, 2014, 18:24:23 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2757 views, 426 downloads, 1 subscription

About: A Matlab benchmarking toolbox for online and adaptive regression with kernels.

Changes:
  • Improvements and demo script for profiler
  • Initial version of documentation
  • Several new algorithms

JMLR SHOGUN 3.2.0

by sonne - February 17, 2014, 20:31:36 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 82821 views, 11460 downloads, 5 subscriptions

Rating: 3/5 stars (based on 6 votes)

About: The SHOGUN machine learning toolbox focuses on large-scale learning methods, in particular Support Vector Machines (SVMs), and provides interfaces to Python, Octave, Matlab, R and the command line.

Changes:

This is mostly a bugfix release:

Features

  • Full support for Python 3
  • Add mini-batch k-means [Parijat Mazumdar]
  • Add k-means++ [Parijat Mazumdar] (see the Python sketch after this changelog)
  • Add sub-sequence string kernel [lambday]

Bugfixes

  • Compile fixes for upcoming swig3.0
  • Speedup for the Gaussian process apply()
  • Improve unit / integration test checks
  • libbmrm uninitialized memory reads
  • libocas uninitialized memory reads
  • Octave 3.8 compile fixes [Orion Poplawski]
  • Fix java modular compile error [Bjoern Esser]
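
As a rough illustration of the new k-means++ seeding from the Python interface: the basic KMeans calls below follow SHOGUN's modular Python API, but the extra constructor flag that turns on k-means++ in this release is an assumption, and the data is made up.

    import numpy as np
    from modshogun import RealFeatures, EuclideanDistance, KMeans

    # SHOGUN expects features as a (dim, n_samples) array.
    data = np.random.rand(2, 100)
    feats = RealFeatures(data)
    distance = EuclideanDistance(feats, feats)

    # Third constructor argument is assumed to request k-means++ seeding.
    kmeans = KMeans(3, distance, True)
    kmeans.train()
    centers = kmeans.get_cluster_centers()
    labels = kmeans.apply().get_labels()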

JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.4

by hn - November 11, 2013, 14:46:52 CET [ Project Homepage BibTeX Download ] 18249 views, 4357 downloads, 3 subscriptions

Rating: 5/5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.

Changes:
  • derivatives w.r.t. inducing points xu in infFITC, infFITC_Laplace, infFITC_EP so that one can treat the inducing points either as fixed given quantities or as additional hyperparameters
  • new GLM likelihood likExp for inter-arrival time modeling
  • new GLM likelihood likWeibull for extremal value regression
  • new GLM likelihood likGumbel for extremal value regression (the underlying densities are sketched after this list)
  • new mean function meanPoly depending polynomially on the data
  • infExact can deal safely with the zero noise variance limit
  • support of GP warping through the new likelihood function likGaussWarp
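
For reference, the new GLM likelihoods wrap the standard exponential, Weibull and Gumbel densities below; how the toolbox links their parameters to the latent GP is defined in the GPML documentation, so treat the parameterization here as a sketch only.

    % Standard densities behind likExp, likWeibull and likGumbel
    % (GPML's exact parameterization and link function may differ).
    p_{\mathrm{Exp}}(y \mid \lambda)     = \lambda\, e^{-\lambda y}, \quad y \ge 0
    p_{\mathrm{Weib}}(y \mid k, \lambda) = \frac{k}{\lambda}\left(\frac{y}{\lambda}\right)^{k-1} e^{-(y/\lambda)^k}, \quad y \ge 0
    p_{\mathrm{Gumb}}(y \mid \mu, \beta) = \frac{1}{\beta}\, e^{-(z + e^{-z})}, \quad z = \frac{y-\mu}{\beta}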

PILCO policy search framework 0.9

by marc - September 27, 2013, 12:45:12 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1641 views, 328 downloads, 1 subscription

About: Data-efficient policy search framework using probabilistic Gaussian process models

Changes:

Initial Announcement on mloss.org.


GPgrid toolkit for fast GP analysis on grid input 0.1

by ejg20 - September 16, 2013, 18:01:16 CET [ BibTeX Download ] 703 views, 264 downloads, 1 subscription

About: GPgrid toolkit for fast GP analysis on grid input

Changes:

Initial Announcement on mloss.org.


About: Fast Multidimensional GP Inference using Projected Additive Approximation

Changes:

Initial Announcement on mloss.org.


Cognitive Foundry 3.3.3

by Baz - May 21, 2013, 05:59:37 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 15794 views, 2510 downloads, 2 subscriptions

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Made code able to compile under both Java 1.6 and 1.7. This required removing some potentially unsafe methods that used varargs with generics.
    • Upgraded XStream dependency to 1.4.4.
    • Improved support for regression algorithms in learning.
    • Added general-purpose adapters to make it easier to compose learning algorithms and adapt their input or output.
  • Common Core:
    • Added isSparse, toArray, dotDivide, and dotDivideEquals methods for Vector and Matrix.
    • Added scaledPlus, scaledPlusEquals, scaledMinus, and scaledMinusEquals to Ring (and thus Vector and Matrix) for potentially faster such operations.
    • Fixed issue where matrix and dense vector equals was not checking for equal dimensionality.
    • Added transform, transformEquals, transformNonZeros, and transformNonZerosEquals to Vector.
    • Made LogNumber into a signed version of a log number and moved the prior unsigned implementation into UnsignedLogNumber.
    • Added EuclideanRing interface that provides methods for times, timesEquals, divide, and divideEquals. Also added Field interface that provides methods for inverse and inverseEquals. These interfaces are now implemented by the appropriate number classes such as ComplexNumber, MutableInteger, MutableLong, MutableDouble, LogNumber, and UnsignedLogNumber.
    • Added interface for Indexer and DefaultIndexer implementation for creating a zero-based indexing of values.
    • Added interfaces for MatrixFactoryContainer and DivergenceFunctionContainer.
    • Added ReversibleEvaluator, which various identity functions implement, as well as a new utility class ForwardReverseEvaluatorPair that creates a reversible evaluator from a pair of other evaluators.
    • Added method to create an ArrayList from a pair of values in CollectionUtil.
    • ArgumentChecker now properly throws assertion errors for NaN values. Also added checks for long types.
    • Fixed handling of Infinity in subtraction for LogMath.
    • Fixed issue with angle method that would cause a NaN if cosine had a rounding error.
    • Added new createMatrix methods to MatrixFactory that initialize the Matrix with a given value.
    • Added copy, reverse, and isEmpty methods for several array types to ArrayUtil.
    • Added utility methods for creating a HashMap, LinkedHashMap, HashSet, or LinkedHashSet with an expected size to CollectionUtil.
    • Added getFirst and getLast methods for List types to CollectionUtil.
    • Removed some calls to System.out and Exception.printStackTrace.
  • Common Data:
    • Added create method for IdentityDataConverter.
    • ReversibleDataConverter is now an extension of ReversibleEvaluator.
  • Learning Core:
    • Added general learner transformation capability to make it easier to adapt and compose algorithms. InputOutputTransformedBatchLearner provides this capability for supervised learning algorithms by composing together a triplet. CompositeBatchLearnerPair does it for a pair of algorithms.
    • Added constant and identity learners.
    • Added Chebyshev, Identity, and Minkowski distance metrics.
    • Added methods to DatasetUtil to get the output values for a dataset and to compute the sum of weights.
    • Made generics more permissive for supervised cost functions.
    • Added ClusterDistanceEvaluator for taking a clustering that encodes the distance from an input value to all clusters and returns the result as a vector.
    • Fixed potential round-off issue in decision tree splitter.
    • Added random subspace technique, implemented in RandomSubspace.
    • Separated functionality from LinearFunction into IdentityScalarFunction. LinearFunction by default is the same, but has parameters that can change the slope and offset of the function.
    • Default squashing function for GeneralizedLinearModel and DifferentiableGeneralizedLinearModel is now a linear function instead of an atan function.
    • Added a weighted estimator for the Poisson distribution.
    • Added Regressor interface for evaluators that are the output of (single-output) regression learning algorithms. Existing such evaluators have been updated to implement this interface.
    • Added support for regression ensembles including additive and averaging ensembles with and without weights. Added a learner for regression bagging in BaggingRegressionLearner.
    • Added a simple univariate regression class in UnivariateLinearRegression.
    • MultivariateDecorrelator is now a VectorInputEvaluator and VectorOutputEvaluator.
    • Added bias term to PrimalEstimatedSubGradient.
  • Text Core:
    • Fixed an issue where the start position of tokens from LetterNumberTokenizer was off by one for all tokens except the first.

MLDemos 0.5.1

by basilio - March 2, 2013, 16:06:13 CET [ Project Homepage BibTeX Download ] 17621 views, 4198 downloads, 2 subscriptions

About: MLDemos is a user-friendly visualization interface for various machine learning algorithms for classification, regression, clustering, projection, dynamical systems, reward maximisation and reinforcement learning.

Changes:

New Visualization and Dataset Features

  • Added 3D visualization of samples and of classification, regression and maximization results
  • Added Visualization panel with individual plots, correlations, density, etc.
  • Added editing tools to drag/magnet data, change class, and increase or decrease the dimensions of the dataset
  • Added categorical dimensions (indexed dimensions with non-numerical values)
  • Added Dataset Editing panel to swap, delete and rename dimensions, classes or categorical values
  • Several bug fixes for display, import/export of data, and classification performance

New Algorithms and Methodologies

  • Added Projections to pre-process data (which can then be classified/regressed/clustered), with LDA, PCA, KernelPCA, ICA, CCA
  • Added Grid-Search panel for batch-testing ranges of values for up to two parameters at a time
  • Added One-vs-All multi-class classification for non-multi-class algorithms
  • Trained models can now be kept and tested on new data (training on one dataset, testing on another)
  • Added a dataset generator panel for standard toy datasets (e.g. swissroll, checkerboard, ...)
  • Added a number of clustering, regression and classification algorithms (FLAME, DBSCAN, LOWESS, CCA, KMEANS++, GP Classification, Random Forests)
  • Added Save/Load Model option for GMMs and SVMs
  • Added Growing Hierarchical Self Organizing Maps (original code by Michael Dittenbach)
  • Added Automatic Relevance Determination for SVM with RBF kernel (thanks to Ashwini Shukla!)


BRML toolbox 070711

by DavidBarber - July 17, 2011, 19:30:15 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 57712 views, 3523 downloads, 1 subscription

About: Bayesian Reasoning and Machine Learning toolbox

Changes:

Fixed some small bugs and updated some demos.


About: Matlab implementation of variational Gaussian approximate inference for Bayesian Generalized Linear Models.

Changes:

Code restructure and bug fix.