All entries.
Showing items 301-310 of 536 (page 31 of 54).

GPUML: GPUs for kernel machines 4

by balajivasan - February 26, 2010, 18:12:46 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4826 views, 821 downloads, 1 subscription

About: GPUML is a library that provides a C/C++ and MATLAB interface for speeding up weighted kernel summation and kernel matrix construction on the GPU. These computations are common to several machine learning algorithms, such as kernel density estimation, kernel regression, and kernel PCA.
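
As a rough sketch of the computation GPUML accelerates (plain NumPy, not GPUML's actual interface), the weighted kernel summation f(y_j) = sum_i w_i k(x_i, y_j) for a Gaussian kernel amounts to constructing the kernel matrix and taking a weighted sum over its rows; GPUML offloads exactly this O(n·m) work to the GPU.

```python
import numpy as np

def gaussian_kernel_sum(X, Y, w, bandwidth=1.0):
    """Naive weighted kernel summation f(y_j) = sum_i w_i * exp(-||x_i - y_j||^2 / (2 h^2)).

    X: (n, d) source points, Y: (m, d) evaluation points, w: (n,) weights.
    Illustrative CPU version of the O(n*m) computation GPUML runs on the GPU.
    """
    # Pairwise squared distances via ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    K = np.exp(-sq / (2.0 * bandwidth**2))   # kernel matrix construction
    return w @ K                             # weighted kernel summation

# Example: an (unnormalised) kernel density estimate at a few query points
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
Y = rng.normal(size=(5, 3))
print(gaussian_kernel_sum(X, Y, np.full(1000, 1.0 / 1000)))
```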

Changes:

Initial Announcement on mloss.org.


FWTN 1.0

by hn - March 25, 2010, 16:58:24 CET [ Project Homepage BibTeX Download ] 3681 views, 820 downloads, 1 subscription

About: Orthonormal wavelet transform for D-dimensional tensors over L levels, supporting generic quadrature mirror filters and tensor sizes. Runtime is O(n); written in plain C, with a MEX wrapper and a demo provided.
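
For orientation, here is a minimal NumPy sketch of one level of a separable D-dimensional orthonormal wavelet transform using the Haar filter pair. FWTN performs this kind of axis-by-axis filtering in C with arbitrary quadrature mirror filters, so this is only an illustration of the idea, not the library's interface.

```python
import numpy as np

def haar_step(a, axis):
    """One level of an orthonormal Haar transform along one axis.

    Splits the axis (assumed to have even length) into approximation and
    detail coefficients; applying this along every axis gives one level of
    a separable D-dimensional wavelet decomposition.
    """
    a = np.moveaxis(a, axis, 0)
    even, odd = a[0::2], a[1::2]
    approx = (even + odd) / np.sqrt(2.0)
    detail = (even - odd) / np.sqrt(2.0)
    out = np.concatenate([approx, detail], axis=0)
    return np.moveaxis(out, 0, axis)

x = np.random.randn(8, 8, 8)          # a 3-dimensional tensor
y = x
for ax in range(x.ndim):               # one decomposition level over all axes
    y = haar_step(y, ax)
print(np.allclose(np.linalg.norm(x), np.linalg.norm(y)))  # orthonormal: energy preserved
```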

Changes:

Initial Announcement on mloss.org.


r-cran-RSNNS 0.4-3

by r-cran-robot - January 10, 2012, 00:00:00 CET [ Project Homepage BibTeX Download ] 4812 views, 818 downloads, 0 subscriptions

About: Neural Networks in R using the Stuttgart Neural Network Simulator (SNNS)

Changes:

Fetched by r-cran-robot on 2012-02-01 00:00:11.194183


Oger 1.1.3

by dvrstrae - August 13, 2012, 14:55:41 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2188 views, 817 downloads, 1 subscription

About: Oger (the OrGanic Environment for Reservoir computing) is a Python toolbox for rapidly building, training, and evaluating modular learning architectures on large datasets.
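
As background on the kind of model Oger composes, the sketch below implements a bare-bones echo state network with a ridge-regression readout in plain NumPy. It is not Oger's node/flow API, only an illustration of the underlying reservoir computation.

```python
import numpy as np

# Minimal echo state network (illustrative NumPy only, not Oger's node classes).
rng = np.random.default_rng(1)
n_in, n_res = 1, 200

W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)            # reservoir state update
        states.append(x.copy())
    return np.array(states)                        # (T, n_res) states for a linear readout

u = np.sin(np.linspace(0, 20, 500))[:, None]
states = run_reservoir(u)

# Ridge-regression readout for a one-step-ahead prediction task
ridge = 1e-6
A, y = states[:-1], u[1:, 0]
w_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res), A.T @ y)
print(np.mean((A @ w_out - y) ** 2))               # training mean squared error
```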

Changes:

Initial Announcement on mloss.org.


libstb 1.8

by wbuntine - April 24, 2014, 09:02:17 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4213 views, 805 downloads, 1 subscription

About: Generalised Stirling numbers for Pitman-Yor processes: this library provides ways of computing generalised 2nd-order Stirling numbers for Pitman-Yor and Dirichlet processes. Included are a tester and a parameter optimiser. It accompanies Buntine and Hutter's article (http://arxiv.org/abs/1007.0296) and a series of papers by Buntine and students at NICTA and ANU.
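
As a small illustration of what the library tabulates, the generalised second-order Stirling numbers obey a simple recurrence (see the Buntine and Hutter article above). The Python sketch below fills such a table in log space, which is the kind of lookup table libstb precomputes in C; it is illustrative only and not the library's interface.

```python
import numpy as np

def stirling_table(N, a):
    """Table of generalised second-order Stirling numbers S^n_k(a) for a
    Pitman-Yor process with discount a, via the recurrence

        S^{n+1}_k = S^n_{k-1} + (n - k*a) * S^n_k,   S^0_0 = 1.

    Entries are stored as logarithms for numerical stability.
    """
    logS = np.full((N + 1, N + 1), -np.inf)
    logS[0, 0] = 0.0
    for n in range(N):
        for k in range(1, n + 2):
            terms = [logS[n, k - 1]]
            if n - k * a > 0:
                terms.append(np.log(n - k * a) + logS[n, k])
            logS[n + 1, k] = np.logaddexp.reduce(terms)
    return logS

# With discount a = 0 these reduce to unsigned Stirling numbers of the first kind,
# e.g. S^4_2(0) = 11.
print(np.exp(stirling_table(4, 0.0)[4, 2]))
```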

Changes:

Moved repository to GitHub, and added thread support to use the main table lookups in multi-threaded code.


SnOB beta

by risi - October 5, 2008, 21:39:18 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3771 views, 803 downloads, 1 subscription

About: SnOB is a C++ library implementing fast Fourier transforms on the symmetric group (group of permutations). Such Fourier transforms are used by some ranking and identity management algorithms, as [...]
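
For context, the Fourier coefficient of a function f on the symmetric group S_n at a matrix representation rho is f_hat(rho) = sum_sigma f(sigma) rho(sigma). The naive Python sketch below evaluates this by brute force over all n! permutations at the (reducible) permutation representation of S_3; SnOB's contribution is computing the coefficients at the irreducible representations with fast algorithms, so this only illustrates the object being computed, not SnOB's API.

```python
import numpy as np
from itertools import permutations

def perm_matrix(p):
    """Permutation (defining) representation: matrix with M[p[i], i] = 1."""
    n = len(p)
    M = np.zeros((n, n))
    M[list(p), np.arange(n)] = 1.0
    return M

def fourier_coefficient(f, n):
    """Naive O(n! * n^2) Fourier coefficient f_hat(rho) = sum_sigma f(sigma) rho(sigma)
    at the permutation representation of S_n (brute force over all permutations)."""
    return sum(f(p) * perm_matrix(p) for p in permutations(range(n)))

# Example function on S_3: weight each permutation by its number of fixed points.
f = lambda p: sum(p[i] == i for i in range(len(p)))
print(fourier_coefficient(f, 3))
```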

Changes:

Initial Announcement on mloss.org.


r-cran-caretLSF 1.25

by r-cran-robot - December 3, 2008, 00:00:00 CET [ Project Homepage BibTeX Download ] 2891 views, 802 downloads, 1 subscription

About: Classification and Regression Training, LSF style: augments some caret functions for parallel processing.

Changes:

Initial Announcement on mloss.org.


HDDM 0.5

by Wiecki - April 24, 2013, 02:53:07 CET [ Project Homepage BibTeX Download ] 3020 views, 801 downloads, 1 subscription

About: HDDM is a Python toolbox for hierarchical Bayesian parameter estimation of the Drift Diffusion Model (via PyMC). Drift Diffusion Models are widely used in psychology and cognitive neuroscience to study decision making.
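
For readers unfamiliar with the model HDDM fits: in a drift-diffusion trial, noisy evidence accumulates at drift rate v between two boundaries separated by a, starting at relative point z, and a non-decision time t is added to the response time. The NumPy sketch below simulates single trials with Euler-Maruyama steps; it is only an illustration of the generative model, not HDDM's API (HDDM fits parameters hierarchically via PyMC rather than by simulation).

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ddm_trial(v, a, z, t, dt=1e-3, sigma=1.0):
    """Simulate one drift-diffusion trial with drift v, boundary separation a,
    relative starting point z in (0, 1), and non-decision time t.

    Returns (response_time, choice), where choice is 1 for the upper boundary
    and 0 for the lower boundary.
    """
    x, elapsed = z * a, 0.0
    while 0.0 < x < a:                     # accumulate until a boundary is hit
        x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        elapsed += dt
    return t + elapsed, int(x >= a)

trials = [simulate_ddm_trial(v=0.5, a=2.0, z=0.5, t=0.3) for _ in range(1000)]
print(np.mean([choice for _, choice in trials]))  # fraction of upper-boundary responses
```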

Changes:
  • New and improved HDDM model with the following changes:
    • Priors: by default the model uses informative priors (see http://ski.clps.brown.edu/hddm_docs/methods.html#hierarchical-drift-diffusion-models-used-in-hddm). If you want uninformative priors, set informative=False.
    • Sampling: this model uses slice sampling, which converges faster even though individual samples are slower to generate. In our experiments, a burn-in of 20 is often enough.
    • Inter-trial variability parameters are only estimated at the group level, not for individual subjects.
    • The old model has been renamed to HDDMTransformed.
    • HDDMRegression and HDDMStimCoding also use this new model.
  • HDDMRegression takes patsy model specification strings. See http://ski.clps.brown.edu/hddm_docs/howto.html#estimate-a-regression-model and http://ski.clps.brown.edu/hddm_docs/tutorial_regression_stimcoding.html#chap-tutorial-hddm-regression
  • Improved online documentation at http://ski.clps.brown.edu/hddm_docs
  • A new HDDM demo at http://ski.clps.brown.edu/hddm_docs/demo.html
  • Ratcliff's quantile optimization method for single subjects and groups using the .optimize() method
  • Maximum likelihood optimization.
  • Many bugfixes and better test coverage.
  • The hddm_fit.py command-line utility is deprecated.

OXlearn 1.0

by gwestermann - January 11, 2010, 11:48:26 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3897 views, 800 downloads, 1 subscription

About: OXlearn is a free neural network simulation package that lets you build, train, test, and analyse connectionist neural network models. Because OXlearn is implemented as a MATLAB toolbox, it runs on all operating systems (Windows, Linux, Mac, etc.), and a compiled version is available for Windows XP.

Changes:

Initial Announcement on mloss.org.


r-cran-randomSurvivalForest 3.6.3

by r-cran-robot - May 21, 2010, 00:00:00 CET [ Project Homepage BibTeX Download ] 3841 views, 794 downloads, 1 subscription

About: Random Survival Forests

Changes:

Fetched by r-cran-robot on 2013-03-01 00:00:08.083405

