All entries.
Showing Items 291-300 of 624 (page 30 of 63)

Dirichlet Forest LDA 0.1.1

by davidandrzej - July 16, 2009, 21:59:53 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6328 views, 1311 downloads, 1 subscription

About: This software implements the Dirichlet Forest (DF) Prior within the Latent Dirichlet Allocation (LDA) model. When combined with LDA, the Dirichlet Forest Prior allows the user to encode domain knowledge (must-links and cannot-links between words) into the prior on topic-word multinomials.

Changes:

Initial Announcement on mloss.org.


arts 0.2

by sonne - May 25, 2009, 09:56:31 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6320 views, 1362 downloads, 1 subscription

About: ARTS is an accurate predictor for Transcription Start Sites (TSS).

Changes:

Initial Announcement on mloss.org.


SVQP 2

by leonbottou - January 31, 2009, 14:22:04 CET [ Project Homepage BibTeX Download ] 6292 views, 2105 downloads, 0 subscriptions

About: SVQP1 and SVQP2 are QP solvers for training SVMs.

Changes:

Initial Announcement on mloss.org.
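The QP that solvers like SVQP tackle is the SVM dual. As a rough illustration (not SVQP's own algorithm or API), the sketch below solves the dual of a linear L1-loss SVM by coordinate ascent; the data, function name, and parameters are all hypothetical:

```python
# Minimal sketch of the SVM training QP: dual coordinate ascent for a
# linear L1-loss SVM (no bias term; toy data separable through the origin).
import numpy as np

def svm_dual_cd(X, y, C=1.0, epochs=50):
    """Approximately solve min_w 0.5*||w||^2 + C*sum(hinge) via its dual QP."""
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    Qii = (X * X).sum(axis=1)              # diagonal of the Gram matrix
    for _ in range(epochs):
        for i in range(n):
            G = y[i] * X[i].dot(w) - 1.0   # gradient of the i-th dual coordinate
            a_new = min(max(alpha[i] - G / Qii[i], 0.0), C)
            w += (a_new - alpha[i]) * y[i] * X[i]   # keep w = sum alpha_i y_i x_i
            alpha[i] = a_new
    return w

X = np.array([[2.0, 2.0], [3.0, 1.5], [-2.0, -1.0], [-1.5, -2.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = svm_dual_cd(X, y)
```

On this separable toy set, `np.sign(X.dot(w))` recovers the labels `y` after training.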


Sequin v1.1.0.0

by apitman - September 23, 2011, 11:47:53 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6236 views, 1506 downloads, 1 subscription

About: Sequin is an open source sequence mining library written in C#.

Changes:

Sequin v1.1.0.0 released


PSVM 1.31

by mhex - July 29, 2010, 10:02:12 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6194 views, 1596 downloads, 1 subscription

Rating: 4.5/5 (based on 2 votes)

About: PSVM: support vector classification, regression, and feature extraction for non-square dyadic data and non-Mercer kernels.

Changes:

Initial Announcement on mloss.org.


FlexCRFs 0.3

by pxhieu - May 10, 2008, 09:24:54 CET [ Project Homepage BibTeX Download ] 6179 views, 3177 downloads, 1 subscription

About: FlexCRFs is a conditional random field toolkit for segmenting and labeling sequence data, written in C/C++ using the STL. It was implemented based on the theoretical model presented in (Lafferty et [...]

Changes:

Initial Announcement on mloss.org.


CVXOPT 1.1

by jdahl - October 24, 2008, 21:37:16 CET [ Project Homepage BibTeX Download ] 6160 views, 1823 downloads, 0 comments, 2 subscriptions

Rating: 4/5 (based on 3 votes)

About: CVXOPT is a free software package for convex optimization based on the Python programming language. It can be used with the interactive Python interpreter, on the command line by executing Python [...]

Changes:

Initial Announcement on mloss.org.


OpenCog pre-1.0

by ferrouswheel - January 11, 2009, 22:51:39 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6152 views, 2442 downloads, 1 subscription

About: OpenCog aims to provide research scientists and software developers with a common platform to build and share artificial intelligence programs. The long-term goal of OpenCog is acceleration of the [...]

Changes:

Initial Announcement on mloss.org.


HDDM 0.5

by Wiecki - April 24, 2013, 02:53:07 CET [ Project Homepage BibTeX Download ] 6148 views, 1518 downloads, 1 subscription

About: HDDM is a Python toolbox for hierarchical Bayesian parameter estimation of the Drift Diffusion Model (via PyMC). Drift Diffusion Models are widely used in psychology and cognitive neuroscience to study decision making.

Changes:
  • New and improved HDDM model with the following changes:
    • Priors: by default the model uses informative priors (see http://ski.clps.brown.edu/hddm_docs/methods.html#hierarchical-drift-diffusion-models-used-in-hddm). If you want uninformative priors, set informative=False.
    • Sampling: this model uses slice sampling, which converges faster although each individual sample is slower to generate. In our experiments, a burn-in of 20 is often good enough.
    • Inter-trial variability parameters are only estimated at the group level, not for individual subjects.
    • The old model has been renamed to HDDMTransformed.
    • HDDMRegression and HDDMStimCoding are also using this model.
  • HDDMRegression takes patsy model specification strings. See http://ski.clps.brown.edu/hddm_docs/howto.html#estimate-a-regression-model and http://ski.clps.brown.edu/hddm_docs/tutorial_regression_stimcoding.html#chap-tutorial-hddm-regression
  • Improved online documentation at http://ski.clps.brown.edu/hddm_docs
  • A new HDDM demo at http://ski.clps.brown.edu/hddm_docs/demo.html
  • Ratcliff's quantile optimization method for single subjects and groups using the .optimize() method
  • Maximum likelihood optimization.
  • Many bugfixes and better test coverage.
  • The hddm_fit.py command line utility is deprecated.
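For readers unfamiliar with the model HDDM fits, here is a minimal, hypothetical sketch of the generative Drift Diffusion Model itself (not HDDM's API): evidence accumulates with drift v and Gaussian noise until it crosses one of two decision boundaries; the parameter names follow the usual DDM convention.

```python
# Sketch of one DDM trial: a noisy accumulator starting at z*a drifts at
# rate v until it hits boundary a (upper choice) or 0 (lower choice).
import random

def simulate_ddm(v=0.5, a=2.0, z=0.5, t0=0.3, dt=1e-3, sigma=1.0, rng=None):
    """Return (choice, reaction_time) for one simulated trial."""
    rng = rng or random.Random()
    x = z * a                      # starting point between boundaries 0 and a
    t = 0.0
    while 0.0 < x < a:
        x += v * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    choice = 1 if x >= a else 0    # upper vs. lower boundary
    return choice, t0 + t          # add non-decision time t0

rng = random.Random(42)
trials = [simulate_ddm(v=1.0, rng=rng) for _ in range(200)]
upper_rate = sum(c for c, _ in trials) / len(trials)
```

With a positive drift rate, most simulated trials terminate at the upper boundary, and every reaction time exceeds the non-decision time t0.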

Sparse PCA 2.0

by tbuehler - December 31, 2015, 16:24:42 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6146 views, 1451 downloads, 3 subscriptions

About: A Matlab implementation of Sparse PCA using the inverse power method for nonlinear eigenproblems.

Changes:
  • Added deflation scheme to compute multiple principal components
  • Several internal runtime and memory optimizations
  • API change: sparsePCA.m is now used to compute multiple components; use computeTradeOffCurve.m to reproduce the examples in the NIPS paper
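The toolbox implements the inverse power method for nonlinear eigenproblems; as a simpler stand-in to convey the idea of a sparse principal component, the sketch below uses truncated power iteration (a different, plainly simpler technique), keeping only the k largest-magnitude entries of the iterate. Everything here, including the toy covariance, is illustrative:

```python
# Truncated power iteration: approximate a k-sparse leading eigenvector
# of a covariance matrix by hard-thresholding after each power step.
import numpy as np

def sparse_pc(Sigma, k, iters=100):
    """Return a unit-norm vector with at most k nonzeros approximating
    the leading eigenvector of Sigma."""
    n = Sigma.shape[0]
    v = np.ones(n) / np.sqrt(n)
    for _ in range(iters):
        v = Sigma @ v
        keep = np.argsort(np.abs(v))[-k:]   # indices of k largest entries
        mask = np.zeros(n)
        mask[keep] = 1.0
        v = v * mask                        # hard-threshold to k entries
        v /= np.linalg.norm(v)
    return v

# Toy covariance whose leading direction lives on the first two coordinates.
Sigma = np.diag([5.0, 4.0, 0.1, 0.1])
Sigma[0, 1] = Sigma[1, 0] = 1.0
v = sparse_pc(Sigma, k=2)
```

The result is unit-norm with support on the first two coordinates, matching where the variance of the toy covariance is concentrated.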
