All entries.

About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models), as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, with some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework that allows for both MAP estimation and approximate Bayesian inference.
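The MAP-estimation side of such a framework boils down to penalised-likelihood problems of the Lasso type. Below is a minimal, illustrative NumPy sketch (not glm-ie's Matlab API) of solving one such sparse linear model by proximal gradient descent; the names lasso_map, X, y and tau are assumptions made for the example.

```python
# Illustrative sketch only -- NOT glm-ie's Matlab API.
# MAP estimation for a sparse linear model (Lasso) via proximal gradient (ISTA):
#     minimize_u  0.5*||X u - y||^2 + tau*||u||_1
import numpy as np

def lasso_map(X, y, tau, n_iter=500):
    """Return a MAP estimate under a Gaussian likelihood and a Laplace (sparsity) prior."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the quadratic term's gradient
    u = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ u - y)           # gradient of the data-fit term
        z = u - grad / L                   # gradient step
        u = np.sign(z) * np.maximum(np.abs(z) - tau / L, 0.0)  # soft-thresholding (prox of L1)
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    u_true = np.zeros(20); u_true[:3] = [3.0, -2.0, 1.5]   # sparse ground truth
    y = X @ u_true + 0.1 * rng.standard_normal(100)
    print(np.round(lasso_map(X, y, tau=5.0), 2))
```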

Changes:

  • Added factorial mean field inference as a third algorithm, complementing expectation propagation and variational Bayes
  • Generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used


MLDemos 0.5.1

by basilio - March 2, 2013, 16:06:13 CET [ Project Homepage BibTeX Download ] 20226 views, 4772 downloads, 2 subscriptions

About: MLDemos is a user-friendly visualization interface for various machine learning algorithms for classification, regression, clustering, projection, dynamical systems, reward maximisation and reinforcement learning.

Changes:

New Visualization and Dataset Features
  • Added 3D visualization of samples and classification, regression and maximization results
  • Added Visualization panel with individual plots, correlations, density, etc.
  • Added Editing tools to drag/magnet data, change class, increase or decrease dimensions of the dataset
  • Added categorical dimensions (indexed dimensions with non-numerical values)
  • Added Dataset Editing panel to swap, delete and rename dimensions, classes or categorical values
  • Several bug-fixes for display, import/export of data, classification performance

New Algorithms and methodologies
  • Added Projections to pre-process data (which can then be classified/regressed/clustered), with LDA, PCA, KernelPCA, ICA, CCA
  • Added Grid-Search panel for batch-testing ranges of values for up to two parameters at a time
  • Added One-vs-All multi-class classification for non-multi-class algorithms
  • Trained models can now be kept and tested on new data (training on one dataset, testing on another)
  • Added a dataset generator panel for standard toy datasets (e.g. swissroll, checkerboard, ...)
  • Added a number of clustering, regression and classification algorithms (FLAME, DBSCAN, LOWESS, CCA, KMEANS++, GP Classification, Random Forests)
  • Added Save/Load Model option for GMMs and SVMs
  • Added Growing Hierarchical Self Organizing Maps (original code by Michael Dittenbach)
  • Added Automatic Relevance Determination for SVM with RBF kernel (Thanks to Ashwini Shukla!)


About: This toolbox provides functions for maximizing and minimizing submodular set functions, with applications to Bayesian experimental design, inference in Markov Random Fields, clustering and others.
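As an illustration of the kind of routine such a toolbox provides, here is a hedged Python sketch of the classic lazy greedy algorithm for maximizing a monotone submodular function under a cardinality constraint. It mirrors the idea behind the toolbox's lazy evaluations but does not reproduce the sfo_* Matlab API; the names lazy_greedy and cover are invented for the example.

```python
# Illustrative Python sketch of lazy greedy submodular maximization --
# not the toolbox's Matlab sfo_* API.
import heapq

def lazy_greedy(f, ground_set, k):
    """Greedily pick k elements maximizing a monotone submodular f(set),
    using lazy (possibly stale) marginal-gain evaluations."""
    selected, value = [], f(frozenset())
    # Max-heap of (-gain, element); initial gains are f({e}) - f({}).
    heap = [(-(f(frozenset([e])) - value), e) for e in ground_set]
    heapq.heapify(heap)
    while len(selected) < k and heap:
        _, e = heapq.heappop(heap)
        fresh = f(frozenset(selected + [e])) - f(frozenset(selected))
        if not heap or -heap[0][0] <= fresh:   # stale bound still the best: take it
            selected.append(e)
            value += fresh
        else:                                  # otherwise re-insert with the updated gain
            heapq.heappush(heap, (-fresh, e))
    return selected, value

if __name__ == "__main__":
    # Toy coverage function: f(S) = |union of the sets indexed by S| (submodular).
    sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6}, 3: {1}}
    cover = lambda S: len(set().union(*(sets[i] for i in S))) if S else 0
    print(lazy_greedy(cover, list(sets), k=2))
```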

Changes:
  • Modified specification of optional parameters (using sfo_opt)
  • Added sfo_ls_lazy for maximizing nonnegative submodular functions
  • Added sfo_fn_infogain, sfo_fn_lincomb, sfo_fn_invert, ...
  • Added additional documentation and more examples
  • Now Octave ready

MDP Modular toolkit for Data Processing 3.3

by otizonaizit - October 4, 2012, 15:17:33 CET [ Project Homepage BibTeX Download ] 18123 views, 4657 downloads, 1 subscription

Rating: 4.5/5 (based on 3 votes)

About: MDP is a Python library of widely used data processing algorithms that can be combined according to a pipeline analogy to build more complex data processing software. The base of available algorithms includes signal processing methods (Principal Component Analysis, Independent Component Analysis, Slow Feature Analysis), manifold learning methods ([Hessian] Locally Linear Embedding), several classifiers, probabilistic methods (Factor Analysis, RBM), data pre-processing methods, and many others.
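The pipeline analogy in practice: trained nodes are concatenated into a Flow that then behaves like a single transformation. A minimal sketch, assuming MDP's standard node classes (mdp.nodes.PCANode, mdp.nodes.SFANode) and toy random data:

```python
# Minimal sketch of MDP's pipeline ("Flow") idea; node names assume MDP's
# standard node library (mdp.nodes.PCANode, mdp.nodes.SFANode).
import numpy as np
import mdp

x = np.random.random((1000, 10))           # 1000 observations, 10 variables

# Chain two processing steps: PCA down to 5 components, then Slow Feature Analysis.
flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),
                 mdp.nodes.SFANode(output_dim=3)])
flow.train(x)                              # each node is trained in sequence
y = flow.execute(x)                        # run the whole pipeline as one transformation
print(y.shape)                             # -> (1000, 3)
```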

Changes:

What's new in version 3.3?

  • support sklearn versions up to 0.12
  • cleanly support reload
  • fail gracefully if pp server does not start
  • several bug-fixes and improvements

r-cran-RWeka 0.4-10

by r-cran-robot - January 10, 2012, 00:00:00 CET [ Project Homepage BibTeX Download ] 21196 views, 4642 downloads, 1 subscription

About: R/Weka interface

Changes:

Fetched by r-cran-robot on 2012-02-01 00:00:11.330277


JMLR MultiBoost 1.2.02

by busarobi - March 31, 2014, 16:13:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 26547 views, 4617 downloads, 1 subscription

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.
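For readers unfamiliar with AdaBoost.MH: it treats a K-class problem as K binary (example, class) problems sharing one weight distribution, and boosts weak hypotheses that output a stump decision multiplied by a per-class vote. The following is a rough NumPy sketch of that scheme with single-feature threshold stumps; it is illustrative only and is not MultiBoost's C++ implementation, command-line interface or file formats.

```python
# Rough illustrative sketch of AdaBoost.MH with threshold stumps and per-class
# votes -- NOT MultiBoost's C++ implementation.
import numpy as np

def adaboost_mh(X, y, n_classes, n_rounds=30):
    n, d = X.shape
    Y = -np.ones((n, n_classes)); Y[np.arange(n), y] = 1.0    # +/-1 label matrix
    D = np.full((n, n_classes), 1.0 / (n * n_classes))        # weights over (example, class)
    model = []                                                # (alpha, feature, threshold, votes)
    for _ in range(n_rounds):
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                phi = np.where(X[:, j] <= thr, 1.0, -1.0)     # stump output per example
                gamma = phi @ (D * Y)                         # per-class weighted correlations
                votes = np.where(gamma >= 0, 1.0, -1.0)       # best +/-1 vote for each class
                r = np.abs(gamma).sum()                       # edge of this weak hypothesis
                if best is None or r > best[0]:
                    best = (r, j, thr, votes, phi)
        r, j, thr, votes, phi = best
        r = min(r, 1 - 1e-12)
        alpha = 0.5 * np.log((1 + r) / (1 - r))
        model.append((alpha, j, thr, votes))
        D *= np.exp(-alpha * Y * np.outer(phi, votes))        # reweight example/class pairs
        D /= D.sum()
    return model

def predict(model, X):
    F = np.zeros((X.shape[0], len(model[0][3])))
    for alpha, j, thr, votes in model:
        phi = np.where(X[:, j] <= thr, 1.0, -1.0)
        F += alpha * np.outer(phi, votes)
    return F.argmax(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((150, 2))
    y = (X[:, 0] > 0).astype(int) + (X[:, 1] > 0)             # toy 3-class labels
    model = adaboost_mh(X, y, n_classes=3)
    print("training accuracy:", (predict(model, X) == y).mean())
```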

Changes:

Major changes :

  • The “early stopping” feature can now be based on any metric output with the --outputinfo command line argument.

  • Early stopping now works with the --slowresume command line argument.

Minor fixes:

  • More informative output when testing.

  • Fixed various compilation glitches with recent clang (OS X/Linux).


mldata-utils 0.5.0

by sonne - April 8, 2011, 10:02:44 CET [ Project Homepage BibTeX Download ] 21488 views, 4566 downloads, 1 subscription

About: Tools to convert datasets between various formats, plus performance measures and API functions to communicate with mldata.org

Changes:
  • Changed the task file format so that data splits can have a variable number of items and can be put into up to 256 categories of training/validation/test/not used/...
  • Various bugfixes.

JMLR MSVMpack 1.5

by lauerfab - July 3, 2014, 16:02:49 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 13718 views, 4521 downloads, 2 subscriptions

About: MSVMpack is a Multi-class Support Vector Machine (M-SVM) package. It is dedicated to SVMs that handle more than two classes without relying on decomposition methods, and it implements the four M-SVM models from the literature: the Weston and Watkins M-SVM; the Crammer and Singer M-SVM; the Lee, Lin and Wahba M-SVM; and the M-SVM2 of Guermeur and Monfrini.
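For orientation, such direct (non-decomposition) M-SVMs optimise one joint problem over all class weight vectors. The snippet below is a hedged NumPy sketch that merely evaluates the Weston and Watkins primal objective for given parameters (the margin constant 2 follows the original formulation; some presentations rescale it to 1). It is not MSVMpack's C implementation, and the names ww_msvm_objective, W and b are invented for the example.

```python
# Hedged sketch: evaluate the Weston & Watkins multi-class SVM primal objective
#   0.5 * sum_k ||w_k||^2 + C * sum_i sum_{k != y_i} max(0, 2 - (s_{y_i} - s_k)),
# where s_k = <w_k, x_i> + b_k.  Illustrative only -- not MSVMpack's C code.
import numpy as np

def ww_msvm_objective(W, b, X, y, C=1.0):
    scores = X @ W.T + b                               # (n, K) class scores
    true = scores[np.arange(len(y)), y][:, None]       # score of the correct class
    slack = np.maximum(0.0, 2.0 - (true - scores))     # hinge per (example, class) pair
    slack[np.arange(len(y)), y] = 0.0                  # no constraint against the true class
    return 0.5 * np.sum(W ** 2) + C * slack.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((20, 3)); y = rng.integers(0, 4, size=20)
    W = rng.standard_normal((4, 3)); b = np.zeros(4)
    print(ww_msvm_objective(W, b, X, y, C=1.0))
```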

Changes:
  • Windows binaries are now included (by Emmanuel Didiot)
  • MSVMpack can now be compiled on Windows (by Emmanuel Didiot)
  • Fixed polynomial kernel
  • Minor bug fixes

Apache Mahout 0.8

by gsingers - July 27, 2013, 15:52:32 CET [ Project Homepage BibTeX Download ] 16483 views, 4495 downloads, 2 subscriptions

About: Apache Mahout is an Apache Software Foundation project with the goal of creating both a community of users and a scalable, Java-based framework consisting of many machine learning algorithm [...]

Changes:

Apache Mahout 0.8 contains, amongst a variety of performance improvements and bug fixes, an implementation of Streaming K-Means, deeper Lucene/Solr integration and new scalable recommender algorithms. For a full description of the newest release, see http://mahout.apache.org/.
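As rough orientation for the streaming k-means idea: the data are summarised in a single pass by a weighted set of sketch centroids larger than k, and a standard (weighted) k-means is then run on that sketch. The Python sketch below is a heavily simplified illustration of the single-pass step, not Mahout's Java implementation (which uses a randomised facility-location style assignment and a final ball k-means); all names in it are invented for the example.

```python
# Rough, simplified illustration of the single-pass "sketch" idea behind
# streaming k-means -- not Apache Mahout's Java implementation.
import numpy as np

def streaming_sketch(points, max_centroids=50, init_threshold=1.0):
    """One pass over the data, keeping a small weighted set of centroids."""
    centroids, weights, threshold = [], [], init_threshold
    for p in points:
        if centroids:
            d = np.linalg.norm(np.asarray(centroids) - p, axis=1)
            i = int(d.argmin())
            if d[i] < threshold:                       # close enough: merge into centroid i
                w = weights[i]
                centroids[i] = (w * centroids[i] + p) / (w + 1)
                weights[i] = w + 1
                continue
        centroids.append(p.astype(float)); weights.append(1)   # otherwise open a new centroid
        if len(centroids) > max_centroids:             # sketch too large: relax the threshold
            threshold *= 2.0
    return np.asarray(centroids), np.asarray(weights)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = np.vstack([rng.normal(c, 0.3, size=(500, 2)) for c in (0, 5, 10)])
    C, w = streaming_sketch(rng.permutation(data))
    print(len(C), "weighted centroids summarising", w.sum(), "points")
    # A standard (weighted) k-means on (C, w) would then produce the final k centers.
```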


JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 18196 views, 4434 downloads, 2 subscriptions

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.
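As a reminder of the basic building block behind such a toolbox, here is a minimal NumPy sketch of exact GP regression with a squared-exponential kernel (the standard Cholesky-based prediction equations). It is illustrative only and does not use GPstuff's Matlab API; all function names are invented for the example.

```python
# Minimal NumPy sketch of exact GP regression with a squared-exponential kernel;
# illustrative only, not GPstuff's Matlab API.
import numpy as np

def sqexp(A, B, lengthscale=1.0, sigma_f=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sigma_f**2 * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X, y, Xs, lengthscale=1.0, sigma_f=1.0, sigma_n=0.1):
    """Posterior mean and variance of the latent function at test inputs Xs."""
    K = sqexp(X, X, lengthscale, sigma_f) + sigma_n**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # K^{-1} y via Cholesky
    Ks = sqexp(X, Xs, lengthscale, sigma_f)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(sqexp(Xs, Xs, lengthscale, sigma_f)) - (v**2).sum(0)
    return mean, var

if __name__ == "__main__":
    X = np.linspace(0, 5, 20)[:, None]
    y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(0).standard_normal(20)
    Xs = np.linspace(0, 5, 5)[:, None]
    mu, var = gp_predict(X, y, Xs)
    print(np.round(mu, 2), np.round(var, 3))
```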

Changes:

2014-07-22 Version 4.5

New features

  • Input dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • Option 'autoscale' in gp_rnd.m to get split-normal approximated samples from the posterior predictive distribution of the latent variable.

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.

Improvements

  • New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating new features etc.
    • demo_epinf, demonstrating the input dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes

