All entries.
Showing items 291-300 of 589.

Caffe 0.9999

by sergeyk - August 9, 2014, 01:57:58 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6741 views, 1119 downloads, 2 subscriptions

About: Caffe aims to provide computer vision scientists with a clean, modifiable implementation of state-of-the-art deep learning algorithms. We believe that Caffe is the fastest available GPU CNN implementation. Caffe also provides seamless switching between CPU and GPU, which allows one to train models on fast GPUs and then deploy them on non-GPU clusters. Even in CPU mode, computing predictions takes only 20 ms per image (when images are processed in batch mode).
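
As a rough illustration of the CPU/GPU switching described above, here is a minimal sketch using Caffe's Python bindings; the exact Python entry points in v0.9999 may differ from current pycaffe, and the file names and the "data" blob name are placeholder assumptions.

    import numpy as np
    import caffe

    caffe.set_mode_gpu()    # train and evaluate on a CUDA-capable GPU ...
    # caffe.set_mode_cpu()  # ... or flip this switch to run the same code on CPU-only clusters

    # "deploy.prototxt" and "model.caffemodel" are placeholder file names.
    net = caffe.Net("deploy.prototxt", "model.caffemodel", caffe.TEST)
    batch = np.zeros(net.blobs["data"].data.shape, dtype=np.float32)  # dummy input batch
    net.blobs["data"].data[...] = batch
    out = net.forward()     # batched forward pass; predictions are in the output blobs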

Changes:

LOTS of stuff: https://github.com/BVLC/caffe/releases/tag/v0.9999


chestnut Machine Learning Suite 0.1.1

by damianeads - October 7, 2008, 13:04:19 CET [ Project Homepage BibTeX Download ] 4843 views, 1115 downloads, 1 subscription

About: The Chestnut Machine Learning Library is a suite of machine learning algorithms written in Python, with some code in C for efficiency. Most algorithms are called through a simple, functional API [...]

Changes:

Initial Announcement on mloss.org.


Ohmm 0.02

by hillbig - May 21, 2009, 10:07:53 CET [ Project Homepage BibTeX Download ] 3985 views, 1111 downloads, 1 subscription

About: Ohmm is a library for learning hidden Markov models using an online EM algorithm. The library is specialized for large-scale data, e.g. corpora of one million words. The output includes the estimated parameters and estimation results.
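
Ohmm itself is not a Python package, so the following is only a schematic sketch of the online (stepwise) EM idea the description refers to, not Ohmm's API: sufficient statistics are interpolated with a decaying step size between sequences, with a standard scaled forward-backward pass as the E-step. The function names, the uniform initial distribution, and the step-size schedule are illustrative assumptions.

    import numpy as np

    def expected_counts(seq, A, B):
        """Scaled forward-backward pass: expected transition and emission counts for one sequence."""
        T, K = len(seq), A.shape[0]
        pi = np.full(K, 1.0 / K)                          # uniform initial distribution (simplification)
        f = np.zeros((T, K)); b = np.zeros((T, K))
        f[0] = pi * B[:, seq[0]]; f[0] /= f[0].sum()
        for t in range(1, T):
            f[t] = (f[t - 1] @ A) * B[:, seq[t]]; f[t] /= f[t].sum()
        b[-1] = 1.0
        for t in range(T - 2, -1, -1):
            b[t] = A @ (B[:, seq[t + 1]] * b[t + 1]); b[t] /= b[t].sum()
        g = f * b; g /= g.sum(axis=1, keepdims=True)      # per-step state posteriors
        xi = np.zeros((K, K))
        for t in range(T - 1):
            x = f[t][:, None] * A * (B[:, seq[t + 1]] * b[t + 1])[None, :]
            xi += x / x.sum()                             # expected transitions at step t
        em = np.zeros_like(B)
        for t in range(T):
            em[:, seq[t]] += g[t]                         # expected emission counts
        return xi, em

    def stepwise_em(sequences, n_states, n_symbols, alpha=0.7, seed=0):
        """Online EM: interpolate sufficient statistics with a decaying step size."""
        rng = np.random.default_rng(seed)
        trans = rng.random((n_states, n_states)) + 1.0    # running transition-count statistics
        emit = rng.random((n_states, n_symbols)) + 1.0    # running emission-count statistics
        for t, seq in enumerate(sequences):
            A = trans / trans.sum(axis=1, keepdims=True)  # M-step: normalize counts
            B = emit / emit.sum(axis=1, keepdims=True)
            s_trans, s_emit = expected_counts(seq, A, B)  # E-step on one sequence
            gamma = (t + 2) ** (-alpha)                   # decaying step size
            trans = (1 - gamma) * trans + gamma * s_trans # online interpolation of statistics
            emit = (1 - gamma) * emit + gamma * s_emit
        return trans / trans.sum(axis=1, keepdims=True), emit / emit.sum(axis=1, keepdims=True)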

Changes:

Initial Announcement on mloss.org.


Kernel Multiple Logistic Regression 1.0

by mseeger - November 10, 2007, 22:16:50 CET [ Project Homepage BibTeX Download ] 4935 views, 1110 downloads, 0 subscriptions

About: Efficient implementation of penalized multiple (multi-class) logistic regression with Mercer kernels, i.e. the MAP approximation to the multi-class Gaussian process model. This includes [...]
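
For context, the objective such an implementation typically minimizes (generic notation: λ, the dual coefficients α_c, and the kernel matrix K are not necessarily the package's own symbols) is the multi-class cross-entropy plus a kernel penalty, which coincides with MAP estimation under independent zero-mean Gaussian process priors with covariance K on the per-class functions:

    \min_{\{\alpha_c\}_{c=1}^{C}} \;
      -\sum_{i=1}^{n} \log
        \frac{\exp\!\big(f_{y_i}(x_i)\big)}{\sum_{c=1}^{C} \exp\!\big(f_c(x_i)\big)}
      \;+\; \frac{\lambda}{2} \sum_{c=1}^{C} \alpha_c^{\top} K \alpha_c,
    \qquad
    f_c(x_i) = \sum_{j=1}^{n} \alpha_{c,j}\, K(x_i, x_j).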

Changes:

Initial Announcement on mloss.org.


HDDM 0.5

by Wiecki - April 24, 2013, 02:53:07 CET [ Project Homepage BibTeX Download ] 4376 views, 1108 downloads, 1 subscription

About: HDDM is a Python toolbox for hierarchical Bayesian parameter estimation of the Drift Diffusion Model (via PyMC). Drift Diffusion Models are widely used in psychology and cognitive neuroscience to study decision making.

Changes:
  • New and improved HDDM model with the following changes:
    • Priors: by default the model uses informative priors (see http://ski.clps.brown.edu/hddm_docs/methods.html#hierarchical-drift-diffusion-models-used-in-hddm). If you want uninformative priors, set informative=False.
    • Sampling: this model uses slice sampling, which converges faster even though each individual sample is slower to generate. In our experiments, a burn-in of 20 is often enough.
    • Inter-trial variability parameters are only estimated at the group level, not for individual subjects.
    • The old model has been renamed to HDDMTransformed.
    • HDDMRegression and HDDMStimCoding also use this model.
  • HDDMRegression takes patsy model specification strings. See http://ski.clps.brown.edu/hddm_docs/howto.html#estimate-a-regression-model and http://ski.clps.brown.edu/hddm_docs/tutorial_regression_stimcoding.html#chap-tutorial-hddm-regression
  • Improved online documentation at http://ski.clps.brown.edu/hddm_docs
  • A new HDDM demo at http://ski.clps.brown.edu/hddm_docs/demo.html
  • Ratcliff's quantile optimization method for single subjects and groups via the .optimize() method (see the usage sketch after this list).
  • Maximum likelihood optimization.
  • Many bugfixes and better test coverage.
  • The hddm_fit.py command-line utility is deprecated.
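
A minimal usage sketch along the lines of the changes above; the CSV file name is a placeholder, and the data must contain the reaction-time and response columns described in the linked docs.

    import hddm

    data = hddm.load_csv("my_experiment.csv")   # placeholder path; needs 'rt' and 'response' columns
    model = hddm.HDDM(data, informative=True)   # informative priors (new default); use False for uninformative
    model.find_starting_values()                # MAP-based starting values
    model.sample(2000, burn=20)                 # slice sampling; a short burn-in is often enough
    model.print_stats()                         # posterior summaries

    # Point estimation instead of full MCMC:
    # params = model.optimize('chisquare')      # Ratcliff's quantile optimization method
    # params = model.optimize('ML')             # maximum likelihood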

Experiment Databases for Machine Learning 0.1

by JoaquinVanschoren - October 7, 2008, 18:06:55 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6852 views, 1108 downloads, 1 subscription

About: Experiment Databases for Machine Learning is a large public database of machine learning experiments as well as a framework for producing similar databases for specific goals. It provides a way to [...]

Changes:

Initial Announcement on mloss.org.


About: A Java library to create, process and manage mixtures of exponential families.

Changes:

Initial Announcement on mloss.org.


About: The High Dimensional Discriminant Analysis (HDDA) toolbox contains an efficient supervised classifier for high-dimensional data. This classifier is based on Gaussian models adapted for [...]

Changes:

Initial Announcement on mloss.org.


Tekkotsu 4.0

by touretzky - December 5, 2007, 10:28:02 CET [ Project Homepage BibTeX Download ] 4888 views, 1103 downloads, 0 subscriptions

About: Tekkotsu is a high-level framework for robot programming that provides primitives for perception, manipulation, navigation, and control. It supports a variety of robot platforms.

Changes:

Initial Announcement on mloss.org.


cbMDS Correlation Based Multi Dimensional Scaling 1.2

by emstrick - July 27, 2013, 14:35:36 CET [ BibTeX BibTeX for corresponding Paper Download ] 4579 views, 1100 downloads, 1 subscription

About: The aim is to embed a given data relationship matrix into a low-dimensional Euclidean space such that the point distances / distance ranks correlate best with the original input relationships. Input relationships may be given as (sparse) (asymmetric) distance, dissimilarity, or (negative!) score matrices. Input-output relations are modeled as low-conditioned. (Weighted) Pearson and soft Spearman rank correlation, and unweighted soft Kendall correlation are supported correlation measures for input/output object neighborhood relationships.
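
As a rough, generic illustration of the idea (not the package's own code, which also supports weighting, rank correlations, sparse/asymmetric inputs, and proper gradient-based optimization), one can embed points by directly maximizing the Pearson correlation between embedding distances and the given dissimilarities; the function name and the gradient-free optimizer below are assumptions made for brevity.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.spatial.distance import pdist
    from scipy.stats import pearsonr

    def cbmds_sketch(D, dim=2, seed=0):
        """Toy correlation-based MDS: D is a dense symmetric dissimilarity matrix."""
        n = D.shape[0]
        delta = D[np.triu_indices(n, k=1)]            # input relationships (upper triangle)
        def neg_corr(x):
            d = pdist(x.reshape(n, dim))              # pairwise distances of the embedding
            return -pearsonr(d, delta)[0]             # maximize Pearson correlation
        x0 = np.random.default_rng(seed).standard_normal(n * dim)
        res = minimize(neg_corr, x0, method="Nelder-Mead")  # fine for small toy inputs
        return res.x.reshape(n, dim)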

Changes:
  • Initial release (Ver 1.0): Weighted Pearson correlation and soft Spearman rank correlation, Tue Dec 4 16:14:51 CET 2012

  • Ver 1.1 Added soft Kendall correlation, Fri Mar 8 08:41:09 CET 2013

  • Ver 1.2 Added reconstruction of sparse relationship matrices, Fri Jul 26 16:58:37 CEST 2013

