All entries.
Showing Items 71-80 of 537.

JMLR JNCC2 1.11

by gcorani - January 1, 2009, 03:22:47 CET - 11985 views, 1434 downloads, 0 comments, 1 subscription

About: JNCC2 is the open-source implementation of the Naive Credal Classifier 2 (NCC2), an extension of Naive Bayes towards imprecise probabilities, designed to deliver robust classifications even on [...]

Changes:

Initial Announcement on mloss.org.


Malheur 0.5.4

by konrad - December 25, 2013, 13:20:31 CET - 11716 views, 2267 downloads, 1 subscription

About: Automatic Analysis of Malware Behavior using Machine Learning

Changes:

Support for the new version of libarchive. Minor bug fixes.


r-cran-klaR 0.6-8

by r-cran-robot - March 27, 2013, 00:00:00 CET - 11693 views, 2455 downloads, 1 subscription

About: Classification and visualization

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:05.722314


r-cran-glmnet 1.9-3

by r-cran-robot - March 1, 2013, 00:00:00 CET - 11670 views, 2631 downloads, 1 subscription

About: Lasso and elastic-net regularized generalized linear models
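As a brief reminder of what these models optimize (the standard elastic-net objective for a Gaussian response, as presented in the glmnet documentation; notation here is not taken from this listing):

\[
\min_{\beta_0,\beta}\; \frac{1}{2N}\sum_{i=1}^{N}\bigl(y_i - \beta_0 - x_i^{\top}\beta\bigr)^2
\;+\; \lambda\left[\frac{1-\alpha}{2}\lVert\beta\rVert_2^2 + \alpha\lVert\beta\rVert_1\right],
\]

where \(\alpha = 1\) gives the lasso penalty and \(\alpha = 0\) gives ridge regression.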

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:05.081872


JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET - 11630 views, 3082 downloads, 2 subscriptions

Rating: 5.0/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.
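As a hedged illustration of the kind of computation these inference tools perform (the basic GP regression case only, following Rasmussen and Williams, 2006, not GPstuff's own notation): given training inputs \(X\), noisy targets \(y\) with noise variance \(\sigma^2\), a covariance function \(k\), and test inputs \(X_*\), the posterior predictive distribution of the latent function is Gaussian with

\[
\mu_* = K(X_*, X)\bigl[K(X, X) + \sigma^2 I\bigr]^{-1} y,
\qquad
\Sigma_* = K(X_*, X_*) - K(X_*, X)\bigl[K(X, X) + \sigma^2 I\bigr]^{-1} K(X, X_*).
\]

Sparse approximations of the kind mentioned above aim to avoid the \(O(n^3)\) cost of this matrix inversion.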

Changes:

2014-07-22 Version 4.5

New features

  • Input-dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • Option 'autoscale' in gp_rnd.m to get split-normal approximated samples from the posterior predictive distribution of the latent variable.

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.

Improvements

  • New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating the new features:
    • demo_epinf, demonstrating the input-dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes


Maja Machine Learning Framework 1.0

by jhm - September 13, 2011, 15:13:56 CET - 11558 views, 2366 downloads, 1 subscription

About: The Maja Machine Learning Framework (MMLF) is a general framework for problems in the domain of Reinforcement Learning (RL), written in Python. It provides a set of RL-related algorithms and a set of benchmark domains. Furthermore, it is easily extensible and allows benchmarking of different agents to be automated.

Changes:
  • Experiments can now be invoked from the command line
  • Experiments can now be "scripted"
  • The MMLF Experimenter now contains a basic module for statistical hypothesis testing
  • The MMLF Explorer can now visualize the model that has been learned by an agent

r-cran-rgenoud 5.7-8.1

by r-cran-robot - June 3, 2012, 00:00:00 CET - 11408 views, 2450 downloads, 1 subscription

About: R version of GENetic Optimization Using Derivatives

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:08.101900


JMLR SSA Toolbox 1.3

by paulbuenau - January 24, 2012, 15:51:02 CET - 11378 views, 3498 downloads, 1 subscription

About: The SSA Toolbox is an efficient, platform-independent, standalone implementation of the Stationary Subspace Analysis algorithm with a friendly graphical user interface and a bridge to Matlab. Stationary Subspace Analysis (SSA) is a general-purpose algorithm for the explorative analysis of non-stationary data, i.e. data whose statistical properties change over time. SSA helps to detect, investigate and visualize temporal changes in complex high-dimensional data sets.

Changes:
  • Various bugfixes.

jblas 1.1.1

by mikio - September 1, 2010, 13:53:51 CET - 11346 views, 2814 downloads, 1 subscription

Rating: 3.5/5 (based on 2 votes)

About: jblas is a fast linear algebra library for Java. jblas is based on BLAS and LAPACK, the de facto industry standard for matrix computations, and uses state-of-the-art implementations like ATLAS for all its computational routines, making jblas very fast.

Changes:

Changes from 1.0:

  • Added singular value decomposition
  • Fixed bug with returning complex values
  • Many other minor improvements
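As a minimal, hedged usage sketch (assuming the org.jblas.DoubleMatrix, Solve and Singular classes as documented for the 1.x series; exact signatures may differ in 1.1.1; the SVD call below corresponds to the decomposition added above):

```java
import org.jblas.DoubleMatrix;
import org.jblas.Singular;
import org.jblas.Solve;

public class JblasExample {
    public static void main(String[] args) {
        // Random 3x3 system A x = b; Solve delegates to LAPACK under the hood.
        DoubleMatrix A = DoubleMatrix.randn(3, 3);
        DoubleMatrix b = DoubleMatrix.randn(3, 1);
        DoubleMatrix x = Solve.solve(A, b);

        // Check the residual A*x - b using the matrix product mmul.
        System.out.println("residual norm: " + A.mmul(x).sub(b).norm2());

        // Singular value decomposition: fullSVD returns {U, S, V},
        // with the singular values in S.
        DoubleMatrix[] usv = Singular.fullSVD(A);
        System.out.println("singular values: " + usv[1]);
    }
}
```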

JMLR Surrogate Modeling Toolbox 7.0.2

by dgorissen - September 4, 2010, 07:48:59 CET - 11265 views, 3328 downloads, 1 subscription

About: The SUMO Toolbox is a Matlab toolbox that automatically builds accurate surrogate models (also known as metamodels or response surface models) of a given data source (e.g., simulation code, data set, script, ...) within the accuracy and time constraints set by the user. The toolbox minimizes the number of data points (which it selects automatically) since they are usually expensive.

Changes:

Incremental update fixing some cosmetic issues; coincides with the JMLR publication.

