All entries.

pSpectralClustering 1.1

by tbuehler - July 30, 2014, 19:44:52 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4309 views, 945 downloads, 1 subscription

About: A generalized version of spectral clustering using the graph p-Laplacian.
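
For readers new to the method: the graph p-Laplacian that gives the package its name is, following the corresponding paper (Bühler and Hein, 2009; a sketch in their notation, with w_{ij} the edge weights),

\[ (\Delta_p f)_i = \sum_j w_{ij}\, \phi_p(f_i - f_j), \qquad \phi_p(x) = |x|^{p-1} \operatorname{sign}(x), \]

so that p = 2 recovers the standard graph Laplacian and ordinary spectral clustering.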

Changes:
  • fixed compatibility issue with Matlab R2013a+
  • several internal optimizations

Harry 0.3

by konrad - July 30, 2014, 16:15:26 CET [ Project Homepage BibTeX Download ] 1196 views, 249 downloads, 1 subscription

About: A Tool for Measuring String Similarity

Changes:

This new release implements 21 similarity measures for strings (Option -M). It supports splitting the computation of large similarity matrices into blocks and thus allows comparing large sets of strings (Option -s as well as -x and -y). The command-line interface has been improved and several minor bugs have been fixed.
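
The block-splitting idea is easy to picture with a short sketch (generic Python, illustrating the approach rather than Harry's actual implementation; all names here are made up):

    # Fill an n x n string-distance matrix one block at a time so that
    # memory use stays bounded; a sketch of the idea behind Harry's
    # block options, not Harry's code.
    import numpy as np

    def levenshtein(a, b):
        # Classic dynamic-programming edit distance.
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
            prev = cur
        return prev[-1]

    def distance_block(strings, rows, cols):
        # Compute one rectangular block of the full matrix.
        block = np.empty((len(rows), len(cols)))
        for bi, i in enumerate(rows):
            for bj, j in enumerate(cols):
                block[bi, bj] = levenshtein(strings[i], strings[j])
        return block

    strings = ["kitten", "sitting", "mitten", "fitting"]
    n, step = len(strings), 2
    for r in range(0, n, step):           # blocks of rows ...
        for c in range(0, n, step):       # ... times blocks of columns
            rows = range(r, min(r + step, n))
            cols = range(c, min(c + step, n))
            print((r, c), distance_block(strings, rows, cols))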


QSMM 1.16

by olegvol - July 29, 2014, 19:37:31 CET [ Project Homepage BibTeX Download ] 96 views, 10 downloads, 1 subscription

About: An implementation of adaptive probabilistic mappings.

Changes:

Initial Announcement on mloss.org.


JMLR MLPACK 1.0.9

by rcurtin - July 28, 2014, 20:52:10 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 29964 views, 6019 downloads, 5 subscriptions

Rating: 4.5/5 (based on 1 vote)

About: A scalable, fast C++ machine learning library, with emphasis on usability.

Changes:
  • GMM initialization is now safer and provides a working GMM when constructed with only the dimensionality and number of Gaussians (#314).
  • Added a check for division by 0 in the forward-backward algorithm in HMMs (#314).
  • Fixed MaxVarianceNewCluster (used when re-initializing clusters for k-means) (#314).
  • Fixed the implementation of the Viterbi algorithm in HMM::Predict() (#316).
  • Significant speedups for dual-tree algorithms using the cover tree (#243, #329), including a faster implementation of FastMKS.
  • Fixed the LRSDP optimizer so that it compiles and can be used (#325).
  • CF (collaborative filtering) now expects users and items to be zero-indexed, not one-indexed (#324).
  • CF::GetRecommendations() API change: now requires the number of recommendations as the first parameter. The number of users in the local neighborhood should be specified with CF::NumUsersForSimilarity().
  • Removed incorrect PeriodicHRectBound (#30).
  • Refactored LRSDP into an LRSDP class and a standalone function to be optimized (#318).
  • Fixed centering in kernel PCA (#355).
  • Added simulated annealing (SA) optimizer, contributed by Zhihao Lou.
  • HMMs now support initial state probabilities; these can be set in the constructor, trained, or set manually with HMM::Initial() (#315).
  • Added the Nyström method for kernel matrix approximation, contributed by Marcus Edel (see the formula after this list).
  • Kernel PCA now supports using the Nyström method for approximation.
  • Ball trees now work with dual-tree algorithms, via the BallBound<> bound structure (#320); fixed by Yash Vadalia.
  • The NMF class is now AMF<> and supports far more types of factorizations; contributed by Sumedh Ghaisas.
  • A QUIC-SVD implementation has returned, written by Siddharth Agrawal and based on older code from Mudit Gupta.
  • Added perceptron and decision stump by Udit Saxena (these are weak learners for an eventual AdaBoost class).
  • Sparse autoencoder added by Siddharth Agrawal.
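
For the Nyström item above, the standard approximation (the general form, not MLPACK-specific notation) chooses m landmark points S and replaces the full n x n kernel matrix K by

\[ K \approx K_{nS}\, K_{SS}^{+}\, K_{nS}^{\top}, \]

where K_{nS} is the n x m block of kernel evaluations against the landmarks and K_{SS}^{+} is the pseudoinverse of the m x m landmark block; this reduces the cost from O(n^2) kernel evaluations to O(nm).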

Boosted Decision Trees and Lists 1.0.4

by melamed - July 25, 2014, 23:08:32 CET [ BibTeX Download ] 2114 views, 656 downloads, 3 subscriptions

About: Boosting algorithms for classification and regression, with many variations. Features include: scalable and robust design; easily customizable loss functions; one-shot training for an entire regularization path; continuous checkpointing; and much more.
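
To make the "easily customizable loss functions" point concrete, here is a generic gradient-boosting sketch in Python in which the loss enters only through its negative gradient (an illustration of the general technique, not this package's code; all names are made up):

    # Gradient boosting on regression stumps with a pluggable loss,
    # supplied only via its negative gradient. A generic sketch.
    import numpy as np

    def fit_stump(x, r):
        # Best single-threshold regression stump for pseudo-residuals r.
        best = None
        for t in np.unique(x)[:-1]:  # splitting at the max leaves one side empty
            left, right = r[x <= t], r[x > t]
            sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, t, left.mean(), right.mean())
        _, t, lv, rv = best
        return lambda z: np.where(z <= t, lv, rv)

    def boost(x, y, neg_grad, rounds=200, lr=0.1):
        # neg_grad(y, pred) is the negative gradient of the chosen loss.
        pred = np.zeros_like(y, dtype=float)
        stumps = []
        for _ in range(rounds):
            h = fit_stump(x, neg_grad(y, pred))   # fit to pseudo-residuals
            stumps.append(h)
            pred += lr * h(x)
        return lambda z: sum(lr * h(z) for h in stumps)

    # Squared error: the negative gradient is the plain residual y - pred;
    # swapping in another gradient (e.g. np.sign(y - pred) for absolute
    # error) changes the loss without touching the boosting loop.
    x = np.linspace(0, 1, 100)
    y = np.sin(6 * x)
    model = boost(x, y, neg_grad=lambda y, p: y - p)
    print(np.abs(model(x) - y).mean())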

Changes:
  • added ElasticNets as a regularization option
  • fixed some segfaults, memory leaks, and out-of-range errors that occurred in some corner cases
  • added a couple of I/O optimizations

JMLR JKernelMachines 2.4

by dpicard - July 24, 2014, 13:51:44 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10130 views, 2652 downloads, 2 subscriptions

Rating: 2.5/5 (based on 1 vote)

About: A machine learning library in Java for easy development of new kernels.

Changes:

Version 2.4

  • Added a simple GUI to rapidly test some algorithms
  • New Active Learning package
  • New algorithms (LLSVM, KMeans)
  • New kernels (polynomial and component-wise; see the formula below)
  • Many bugfixes and improvements to existing algorithms
  • Many optimizations

The number of changes in this version is massive, so please test it! Don't forget to report any regressions.
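
For reference, the polynomial kernels mentioned above have the standard form (the general definition, with offset c >= 0 and degree d as hyperparameters; JKernelMachines' exact parametrization may differ):

\[ k(x, x') = (\langle x, x' \rangle + c)^d , \]

and one common construction for "component-wise" kernels is a sum of per-component base kernels, k(x, x') = \sum_i k_i(x_i, x'_i).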


Optunity 0.2.0

by claesenm - July 24, 2014, 10:07:54 CET [ Project Homepage BibTeX Download ] 180 views, 36 downloads, 1 subscription

About: Optunity is a library containing various optimizers for hyperparameter tuning. Hyperparameter tuning is a recurrent problem in many machine learning tasks, both supervised and unsupervised. This package provides several distinct approaches to solving such problems, including helpful facilities such as cross-validation and a plethora of score functions.
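
To make the problem setting concrete, here is a generic random-search tuner with k-fold cross-validation (plain Python/NumPy; this sketches the task Optunity addresses, not Optunity's own API, and every name below is made up):

    # Random-search hyperparameter tuning with k-fold cross-validation.
    import numpy as np

    def kfold_score(train_eval, x, y, params, k=5, rng=None):
        # Mean validation score of a model trained with `params` over k folds.
        rng = rng or np.random.default_rng()
        folds = np.array_split(rng.permutation(len(x)), k)
        scores = []
        for i in range(k):
            val = folds[i]
            trn = np.concatenate([folds[j] for j in range(k) if j != i])
            scores.append(train_eval(x[trn], y[trn], x[val], y[val], params))
        return float(np.mean(scores))

    def random_search(train_eval, x, y, box, num_evals=100, seed=0):
        # `box` maps each hyperparameter name to a (low, high) search interval.
        rng = np.random.default_rng(seed)
        best_params, best_score = None, -np.inf
        for _ in range(num_evals):
            params = {name: rng.uniform(lo, hi) for name, (lo, hi) in box.items()}
            score = kfold_score(train_eval, x, y, params, rng=rng)
            if score > best_score:
                best_params, best_score = params, score
        return best_params, best_score

    # Example: tune ridge regression's regularization strength.
    def ridge(xtr, ytr, xva, yva, params):
        lam = 10.0 ** params["log10_lambda"]
        X = np.column_stack([xtr, np.ones_like(xtr)])
        w = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ ytr)
        pred = np.column_stack([xva, np.ones_like(xva)]) @ w
        return -np.mean((pred - yva) ** 2)   # negated MSE: higher is better

    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 200)
    y = 3 * x + 0.1 * rng.standard_normal(200)
    print(random_search(ridge, x, y, {"log10_lambda": (-6, 2)}, num_evals=50))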

Changes:

Initial Announcement on mloss.org.


JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10900 views, 2938 downloads, 2 subscriptions

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2014-07-22 Version 4.5

New features

  • Input-dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • Option 'autoscale' in gp_rnd.m to obtain split-normal approximated samples from the posterior predictive distribution of the latent variable (see the sketch after this list).

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.
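
For orientation, the split normal referenced in that option has the density (standard form, as in Villani and Larsson, 2006):

\[ p(x) = \sqrt{\tfrac{2}{\pi}}\,\frac{1}{\sigma_1 + \sigma_2} \begin{cases} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma_1^2}\right), & x \le \mu, \\ \exp\!\left(-\frac{(x-\mu)^2}{2\sigma_2^2}\right), & x > \mu, \end{cases} \]

i.e. two half-normals with different scales joined at the mode, which lets the approximation capture skewness in the latent posterior.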

Improvements

  • New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating the new features:
    • demo_epinf, demonstrating the input-dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes


JMLR Waffles 2014-07-05

by mgashler - July 20, 2014, 04:53:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 22134 views, 6708 downloads, 2 subscriptions

About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.)

Changes:

Added support for CUDA GPU-parallelized neural network layers, and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html


pyGPs 1.2

by mn - July 17, 2014, 10:28:55 CET [ Project Homepage BibTeX Download ] 1507 views, 372 downloads, 2 subscriptions

About: pyGPs is a Python package for Gaussian process (GP) regression and classification for machine learning.

Changes:

Changelog pyGPs v1.2

June 30th 2014

structural updates:

  • input targets can now be either a 2-d array of shape (n,1) or a 1-d array of shape (n,)
  • setup.py updated
  • "import pyGPs" instead of "from pyGPs.Core import gp"
  • renamed ".train()" to ".optimize()" (see the usage sketch at the end of this changelog)
  • renamed "Graph-stuff" to "graphExtension"
  • renamed kernelOnGraph to "nodeKernels" and graphKernel to "graphKernels"
  • redundancy removed for model.setData(x,y)
  • rewrote "mean.proceed()" as "getMean()" and "getDerMatrix()"
  • rewrote "cov.proceed()" as "getCovMatrix()" and "getDerMatrix()"
  • renamed cov.LIN to cov.Linear (to be consistent with mean.Linear)
  • renamed module "valid" to "validation"
  • added the Mutag graph dataset (provided as .npz and .mat files)
  • added graphUtil.nomalizeKernel()
  • fixed an iteration-count problem in the graphKernels "PropagationKernel"
  • added unit tests for covariance and mean functions

bug fixes:

  • derivatives for cov.LINard
  • derivative of the scalar for cov.covScale
  • missing pyGPs.mean in demo_GPR_FITC.py

July 8th 2014

structural updates:

  • added a hyperparameter (signal variance s2) for the linear covariance
  • added unit tests for inference and likelihood functions as well as for models
  • the "maximum number of sweeps" warning in EP inference is no longer printed
  • documentation updated

bug fixes:

  • typos in lik.Laplace
  • derivative in lik.Laplace

July 14th 2014

documentation updates:

  • online docs updated
  • API file updated

structural updates:

  • methods that users don't need to call are now private
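
Putting the renamed pieces together, a minimal regression run under the new naming might look as follows (a sketch assembled from the renames listed above; the class name and predict signature follow pyGPs' demos and may differ in detail from the released package):

    import numpy as np
    import pyGPs

    x = np.linspace(-3, 3, 30).reshape(-1, 1)   # 2-d inputs of shape (n,1)
    y = np.sin(x).ravel()                       # 1-d targets of shape (n,) now accepted

    model = pyGPs.GPR()      # GP regression model (class name as in the demos)
    model.setData(x, y)      # attach training data
    model.optimize()         # formerly model.train(): fit hyperparameters

    z = np.linspace(-3, 3, 61).reshape(-1, 1)
    ym, ys2, fm, fs2, lp = model.predict(z)     # assumed GPML-style return values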
