All entries.
Showing Items 151-160 of 623 (page 16 of 63).

libstb 1.8

by wbuntine - April 24, 2014, 09:02:17 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10166 views, 2055 downloads, 1 subscription

About: Generalised Stirling Numbers for Pitman-Yor Processes: this library provides ways of computing generalised second-order Stirling numbers for Pitman-Yor and Dirichlet processes. Included are a tester and a parameter optimiser. This accompanies Buntine and Hutter's article (http://arxiv.org/abs/1007.0296) and a series of papers by Buntine and students at NICTA and ANU.
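
The core quantity can be sketched compactly. Below is a minimal Python illustration (not the library's C API) of the usual recurrence S^a_{n,m} = S^a_{n-1,m-1} + (n-1 - m*a) S^a_{n-1,m}, computed in log space because the numbers overflow quickly; with discount a = 0 it reduces to the unsigned Stirling numbers of the first kind used for Dirichlet processes.

    import numpy as np

    # Minimal sketch (not the libstb C API): tabulate log S^a_{n,m}, the generalised
    # second-order Stirling numbers used for Pitman-Yor (discount a) and Dirichlet
    # (a = 0) processes, via S^a_{n,m} = S^a_{n-1,m-1} + (n-1 - m*a) * S^a_{n-1,m}.
    def log_stirling_table(N, a):
        logS = np.full((N + 1, N + 1), -np.inf)   # logS[n, m] = log S^a_{n,m}
        logS[0, 0] = 0.0                          # S^a_{0,0} = 1
        for n in range(1, N + 1):
            for m in range(1, n + 1):
                coef = n - 1 - m * a
                grow = np.log(coef) + logS[n - 1, m] if coef > 0 else -np.inf
                logS[n, m] = np.logaddexp(logS[n - 1, m - 1], grow)
        return logS

    # a = 0 gives unsigned Stirling numbers of the first kind: S^0_{5,2} = 50.
    print(np.exp(log_stirling_table(5, 0.0)[5, 2]))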

Changes:

Moved the repository to GitHub and added thread support so that the main lookup tables can be used from multi-threaded code.


Easysvm 0.3

by gxr - June 25, 2009, 18:33:04 CET [ Project Homepage BibTeX Download ] 10115 views, 2026 downloads, 1 subscription

About: The Easysvm package provides a set of tools, based on the Shogun toolbox, for training and testing SVMs in a simple way.

Changes:

Fixes for shogun 0.7.3.


About: TinyOS is a small operating system for small (wireless) sensors. LEGO MINDSTORMS NXT is a platform for embedded systems experimentation; the combination of NXT and TinyOS is NXTMOTE.

Changes:

Initial Announcement on mloss.org.


Aleph 0.6

by jiria - January 12, 2009, 20:52:12 CET [ Project Homepage BibTeX Download ] 9981 views, 2718 downloads, 1 subscription

About: Aleph is both a multi-platform machine learning framework aimed at simplicity and performance, and a library of selected state-of-the-art algorithms.

Changes:

Initial Announcement on mloss.org.


GPDT Gradient Projection Decomposition Technique 1.01

by sezaza - December 21, 2007, 20:10:43 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9969 views, 1861 downloads, 1 subscription

Rating: 4/5 stars (based on 1 vote)

About: This is C++ software designed to train large-scale SVMs for binary classification. The algorithm is also implemented in parallel (PGPDT) for distributed-memory, strictly coupled multiprocessor [...]
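
As a rough sketch of the underlying idea (illustration only, not the GPDT code): decomposition methods repeatedly solve a small working-set subproblem of the SVM dual, and GPDT does so with a gradient projection method, projecting onto the box-plus-equality feasible set. Assumed notation in the Python sketch below: Q is the subproblem Hessian, y the labels, C the box bound, t the equality right-hand side.

    import numpy as np

    # One gradient-projection step on an SVM dual subproblem
    #   min_alpha  0.5 * alpha^T Q alpha - 1^T alpha
    #   s.t.       y^T alpha = t,   0 <= alpha <= C
    # Illustration of the idea only; GPDT's actual solver and working-set
    # selection are more sophisticated.
    def project(v, y, C, t, iters=100):
        # Project v onto {alpha : 0 <= alpha <= C, y @ alpha = t} by bisection
        # on the equality multiplier (phi below is monotone in lam); assumes the
        # bracket is wide enough to saturate the clip at its endpoints.
        phi = lambda lam: y @ np.clip(v + lam * y, 0.0, C) - t
        lo, hi = -1e6 - abs(t), 1e6 + abs(t)
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            lo, hi = (lo, mid) if phi(mid) > 0 else (mid, hi)
        return np.clip(v + 0.5 * (lo + hi) * y, 0.0, C)

    def projected_gradient_step(alpha, Q, y, C, t, eta):
        grad = Q @ alpha - 1.0            # gradient of the dual objective
        return project(alpha - eta * grad, y, C, t)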

Changes:

Initial Announcement on mloss.org.


libcmaes 0.9.5

by beniz - March 9, 2015, 09:05:22 CET [ Project Homepage BibTeX Download ] 9938 views, 1917 downloads, 3 subscriptions

About: Libcmaes is a multithreaded C++11 library (with Python bindings) for high-performance black-box stochastic optimization of difficult, possibly non-linear and non-convex functions, using CMA-ES (Covariance Matrix Adaptation Evolution Strategy). Libcmaes can minimize or maximize any function without requiring gradient or differentiability information.
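
To make the "black-box, gradient-free" setting concrete, here is a toy (mu, lambda) evolution strategy in Python. It is deliberately simplistic (no covariance matrix adaptation, crude step-size decay) and is not the libcmaes API; it only illustrates the kind of problem the library addresses.

    import numpy as np

    # Toy (mu, lambda) evolution strategy: sample candidates around the current
    # mean, keep the mu best, recombine.  NOT the libcmaes API and NOT full
    # CMA-ES -- just an illustration of derivative-free minimization.
    def toy_es(f, x0, sigma=0.5, lam=16, mu=4, iters=200, seed=0):
        rng = np.random.default_rng(seed)
        mean = np.asarray(x0, dtype=float)
        for _ in range(iters):
            cands = mean + sigma * rng.standard_normal((lam, mean.size))
            fitness = np.array([f(c) for c in cands])
            elite = cands[np.argsort(fitness)[:mu]]   # mu best samples
            mean = elite.mean(axis=0)                 # recombine into new mean
            sigma *= 0.98                             # crude step-size decay
        return mean, f(mean)

    rosenbrock = lambda x: float(np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2))
    print(toy_es(rosenbrock, [0.0, 0.0]))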

Changes:

This is a major release, with several novelties, improvements and fixes, among which:

  • step-size two-point adaptation scheme for improved performance in some settings, ref #88

  • important bug fixes to the ACM surrogate scheme, ref #57, #106

  • simple high-level workflow under Python, ref #116

  • improved performance in high dimensions, ref #97

  • improved profile likelihood and contour computations, including under geno/pheno transforms, ref #30, #31, #48

  • elitist mechanism for forcing best solutions during evolution, ref #103

  • new legacy plotting function, ref #110

  • optional initial function value, ref #100

  • improved C++ API, ref #89

  • Python bindings support with Anaconda, ref #111

  • configure script now tries to detect numpy when building Python bindings, ref #113

  • Python bindings now have embedded documentation, ref #114

  • support for Travis continuous integration, ref #122

  • lower resolution random seed initialization


Hivemall 0.3

by myui - March 13, 2015, 17:08:22 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9936 views, 1728 downloads, 3 subscriptions

About: Hivemall is a scalable machine learning library running on Hive/Hadoop.

Changes:
  • Supported Matrix Factorization
  • Added support for TF-IDF computation
  • Supported AdaGrad/AdaDelta
  • Supported AdaGradRDA classification
  • Added normalization scheme

bob 1.2.2

by anjos - October 28, 2013, 14:37:36 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9913 views, 2175 downloads, 1 subscription

About: Bob is a free signal-processing and machine learning toolbox originally developed by the Biometrics group at Idiap Research Institute, in Switzerland.

Changes:

Bob 1.2.0 arrives about one year after we released Bob 1.0.0. This new release brings a big set of new features and lots of changes under the hood to make your experiments run even smoother. Some statistics:

  • Diff URL: https://github.com/idiap/bob/compare/v1.1.4...HEAD
  • Commits: 629
  • Files changed: 954
  • Contributors: 7

Here is a quick list of things you should pay attention to while integrating your satellite packages against Bob 1.2.x:

  • The LBP module had its API changed; look at the online docs for more details
  • LLRTrainer has been renamed to CGLogRegTrainer
  • The order in which you pass data to CGLogRegTrainer has been inverted (negatives now go first)
  • For C++ bindings, includes are in bob/python instead of bob/core/python
  • All specialized Bob exceptions are gone; if you were catching them, note that most have been converted into std::runtime_error

For a detailed list of changes and additions, please look at our Changelog page for this release and minor updates:

https://github.com/idiap/bob/wiki/Changelog-from-1.1.4-to-1.2
https://github.com/idiap/bob/wiki/Changelog-from-1.2.0-to-1.2.1
https://github.com/idiap/bob/wiki/Changelog-from-1.2.1-to-1.2.2


About: This local and parallel computation toolbox is the Octave and MATLAB implementation of several localized Gaussian process regression methods: the domain decomposition method (Park et al., 2011, DDM), partially independent conditional (Snelson and Ghahramani, 2007, PIC), localized probabilistic regression (Urtasun and Darrell, 2008, LPR), and bagging for Gaussian process regression (Chen and Ren, 2009, BGP). Most of the localized regression methods can be applied to general machine learning problems, although DDM is only applicable to spatial datasets. In addition, GPLP provides two parallel computation versions of the domain decomposition method. Ease of parallelization is one of the advantages of localized regression, and the two parallel implementations provide good guidance on how to realize this advantage in software.
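
The common idea behind these methods can be sketched briefly: predict at a test point with the standard GP posterior computed from only a local subset of the training data. The Python sketch below (nearest-neighbour subset, RBF kernel) illustrates this general principle only; it is not the GPLP code and does not correspond to any one of DDM, PIC, LPR or BGP in particular.

    import numpy as np

    # Localized GP regression sketch: GP posterior mean/variance at x_star using
    # only the k nearest training points (RBF kernel, fixed hyperparameters).
    def rbf(A, B, ell=1.0, sf2=1.0):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sf2 * np.exp(-0.5 * d2 / ell**2)

    def local_gp_predict(X, y, x_star, k=50, noise=1e-2):
        idx = np.argsort(((X - x_star) ** 2).sum(-1))[:k]   # k nearest neighbours
        Xl, yl = X[idx], y[idx]
        K = rbf(Xl, Xl) + noise * np.eye(len(idx))
        ks = rbf(Xl, x_star[None, :])[:, 0]
        mean = ks @ np.linalg.solve(K, yl)
        var = rbf(x_star[None, :], x_star[None, :])[0, 0] - ks @ np.linalg.solve(K, ks)
        return mean, var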

Changes:

Initial Announcement on mloss.org.


RLS2 MATLAB Toolbox 0.7

by posaune - March 31, 2010, 20:37:11 CET [ Project Homepage BibTeX Download ] 9902 views, 2085 downloads, 1 subscription

About: RLS2 is a multiple kernel learning algorithm that simultaneously learns a regularized predictor and the kernel function. RLS2LIN is a version of RLS2 specialized to linear kernels on each feature. The package contains a set of scripts implementing RLS2 and RLS2LIN, together with a graphical user interface to load data, perform training and validation, and plot results.
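
The ingredients can be sketched as follows: several basis kernel matrices are combined with nonnegative weights, and a regularized least-squares predictor is fit on the combined kernel. The Python sketch below keeps the weights fixed, so it shows the model only; RLS2 itself optimizes the weights jointly with the predictor, which this sketch does not attempt.

    import numpy as np

    # Sketch of the ingredients of multiple-kernel regularized least squares:
    # combine basis kernels with nonnegative weights d and solve kernel ridge
    # regression on the combined kernel.  The weights are fixed here; learning
    # them jointly with the predictor is what RLS2 does (not shown).
    def combined_kernel(kernels, d):
        return sum(dj * Kj for dj, Kj in zip(d, kernels))

    def rls_fit(kernels, d, y, lam=0.1):
        K = combined_kernel(kernels, d)               # (n, n) training kernel
        c = np.linalg.solve(K + lam * len(y) * np.eye(len(y)), y)
        return c                                      # dual coefficients

    def rls_predict(kernels_test, d, c):
        # kernels_test[j] has shape (n_test, n_train) for basis kernel j
        return combined_kernel(kernels_test, d) @ c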

Changes:
  • New kernel functions (rbfall, rbfsingle, polyall, polysingle)
  • Improved interface for pre-processing operations
  • The interface now allows disabling the bias term
  • Fixed bugs in parameter passing (thanks to Andrea Schirru)
