All entries.

GraphLab v1-1908

by dannybickson - November 22, 2011, 12:50:00 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6540 views, 1094 downloads, 1 subscription

About: Multicore/distributed large scale machine learning framework.

Changes:

Update version.


GritBot 2.01

by zenog - September 2, 2011, 14:56:26 CET [ Project Homepage BibTeX Download ] 3193 views, 828 downloads, 1 subscription

About: GritBot is a data cleaning and outlier/anomaly detection program.

Changes:

Initial Announcement on mloss.org.


gWT graph indexing wavelet tree 1.0.0

by ytabei - May 12, 2011, 23:01:17 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4034 views, 792 downloads, 1 subscription

About: Software for graph similarity search over massive graph databases.

Changes:

Initial Announcement on mloss.org.


About: Robust sparse representation has shown significant potential in solving challenging problems in computer vision such as biometrics and visual surveillance. Although several robust sparse models have been proposed and promising results have been obtained, they are either for error correction or for error detection, and learning a general framework that systematically unifies these two aspects and explores their relation is still an open problem. In this paper, we develop a half-quadratic (HQ) framework to solve the robust sparse representation problem. By defining different kinds of half-quadratic functions, the proposed HQ framework can perform both error correction and error detection. More specifically, by using the additive form of HQ, we propose an L1-regularized error correction method that iteratively recovers corrupted data from errors caused by noise and outliers; by using the multiplicative form of HQ, we propose an L1-regularized error detection method that learns from uncorrupted data iteratively. We also show that the L1 regularization solved by the soft-thresholding function has a dual relationship to the Huber M-estimator, which theoretically guarantees the performance of robust sparse representation in terms of M-estimation. Experiments on robust face recognition under severe occlusion and corruption validate our framework and findings.
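For illustration, here is a small Python sketch of the additive-form idea described above (soft-thresholding as the proximal operator of the L1 penalty, alternated with refitting the sparse code); the dictionary D, the plain least-squares refit, and the fixed threshold lam are assumptions for illustration, not the paper's exact algorithm:

    import numpy as np

    def soft_threshold(v, lam):
        # Elementwise soft-thresholding: the proximal operator of the L1 norm,
        # whose dual relationship to the Huber M-estimator is discussed above.
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    def hq_error_correction(D, y, lam=0.1, n_iter=50):
        # Illustrative additive-form iteration: alternately re-fit the code x
        # on the "corrected" observation y - e, then re-estimate the sparse
        # error e by soft-thresholding the residual. D (dictionary), the plain
        # least-squares refit, and lam are placeholders, not the paper's updates.
        e = np.zeros_like(y)
        for _ in range(n_iter):
            x, *_ = np.linalg.lstsq(D, y - e, rcond=None)
            e = soft_threshold(y - D @ x, lam)
        return x, e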

Changes:

Initial Announcement on mloss.org.


hapFabia 1.4.2

by hochreit - December 28, 2013, 17:24:29 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4848 views, 975 downloads, 1 subscription

About: hapFabia is an R package for identifying very short segments of identity by descent (IBD) that are characterized by rare variants in large sequencing data. It detects segments roughly 100 times smaller than those found by previous methods.

Changes:

  • Citation update
  • Plot function improved


Harry 0.4.1

by konrad - January 3, 2016, 14:56:55 CET [ Project Homepage BibTeX Download ] 6807 views, 1497 downloads, 3 subscriptions

About: A Tool for Measuring String Similarity

Changes:

Minor bug fixes for libarchive code


hca 0.61

by wbuntine - September 10, 2014, 03:33:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12832 views, 2087 downloads, 4 subscriptions

About: Multi-core non-parametric and bursty topic models (HDP-LDA, DCMLDA, and other variants of LDA) implemented in C using efficient Gibbs sampling, with hyperparameter sampling and other flexible controls.
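As background for the About text, here is a toy collapsed Gibbs sampler for plain LDA in Python; hca itself is written in C, runs multi-core, and implements the bursty and non-parametric variants named above, so this sketch only illustrates the basic sampling update:

    import numpy as np

    def lda_gibbs(docs, V, K=10, alpha=0.1, beta=0.01, n_iter=200, seed=0):
        # Toy collapsed Gibbs sampler for plain LDA. docs: list of lists of
        # word ids in [0, V). Returns document-topic and topic-word counts.
        rng = np.random.default_rng(seed)
        z = [rng.integers(K, size=len(d)) for d in docs]   # topic of each token
        ndk = np.zeros((len(docs), K))                      # doc-topic counts
        nkw = np.zeros((K, V))                              # topic-word counts
        nk = np.zeros(K)                                    # topic totals
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                ndk[d, k] += 1
                nkw[k, w] += 1
                nk[k] += 1
        for _ in range(n_iter):
            for d, doc in enumerate(docs):
                for i, w in enumerate(doc):
                    # Remove the token's current assignment from the counts.
                    k = z[d][i]
                    ndk[d, k] -= 1
                    nkw[k, w] -= 1
                    nk[k] -= 1
                    # Resample its topic from the collapsed conditional.
                    p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                    k = rng.choice(K, p=p / p.sum())
                    z[d][i] = k
                    ndk[d, k] += 1
                    nkw[k, w] += 1
                    nk[k] += 1
        return ndk, nkw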

Changes:

Corrections to the diagnostics and the topic report. Corrected the estimation of alpha. Beta is now sometimes estimated as well (when phi is estimated).


hcluster 0.2.0

by damianeads - December 14, 2008, 14:03:49 CET [ Project Homepage BibTeX Download ] 3592 views, 943 downloads, 1 subscription

About: This library provides Python functions for agglomerative (hierarchical) clustering.
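A minimal usage sketch, assuming hcluster exposes an interface like the one it later contributed to scipy.cluster.hierarchy (pdist, linkage, fcluster); the exact function names should be checked against the project homepage:

    import numpy as np
    from hcluster import pdist, linkage, fcluster  # assumed API

    # Toy data: 10 points in two dimensions.
    X = np.random.rand(10, 2)

    # Condensed pairwise distances, then single-link agglomerative clustering.
    Z = linkage(pdist(X), method='single')

    # Cut the tree at distance 0.5 to obtain flat cluster labels.
    labels = fcluster(Z, t=0.5, criterion='distance')
    print(labels)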

Changes:

Initial Announcement on mloss.org.


HDDM 0.5

by Wiecki - April 24, 2013, 02:53:07 CET [ Project Homepage BibTeX Download ] 5228 views, 1316 downloads, 1 subscription

About: HDDM is a Python toolbox for hierarchical Bayesian parameter estimation of the Drift Diffusion Model (via PyMC). Drift Diffusion Models are widely used in psychology and cognitive neuroscience to study decision making.

Changes:
  • New and improved HDDM model with the following changes:
    • Priors: by default the model uses informative priors (see http://ski.clps.brown.edu/hddm_docs/methods.html#hierarchical-drift-diffusion-models-used-in-hddm). If you want uninformative priors, set informative=False (a usage sketch follows this list).
    • Sampling: this model uses slice sampling, which converges faster even though each individual sample is slower to generate. In our experiments, a burn-in of 20 is often good enough.
    • Inter-trial variability parameters are only estimated at the group level, not for individual subjects.
    • The old model has been renamed to HDDMTransformed.
    • HDDMRegression and HDDMStimCoding are also using this model.
  • HDDMRegression takes patsy model specification strings. See http://ski.clps.brown.edu/hddm_docs/howto.html#estimate-a-regression-model and http://ski.clps.brown.edu/hddm_docs/tutorial_regression_stimcoding.html#chap-tutorial-hddm-regression
  • Improved online documentation at http://ski.clps.brown.edu/hddm_docs
  • A new HDDM demo at http://ski.clps.brown.edu/hddm_docs/demo.html
  • Ratcliff's quantile optimization for single subjects and groups via the new .optimize() method.
  • Maximum likelihood optimization.
  • Many bugfixes and better test coverage.
  • The hddm_fit.py command line utility is deprecated.
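A minimal usage sketch of the new defaults; the CSV file name and its columns are placeholders, and the exact keyword names should be checked against the HDDM docs linked above:

    import hddm

    # Load a hypothetical data file with the usual HDDM columns
    # (rt, response, subj_idx); the file name is a placeholder.
    data = hddm.load_csv('experiment.csv')

    # Informative priors are the new default; pass informative=False
    # to recover the older uninformative priors.
    model = hddm.HDDM(data, informative=False)

    # Slice sampling converges quickly; per the notes above, a burn-in
    # of about 20 is often enough.
    model.sample(2000, burn=20)
    model.print_stats()

    # The new .optimize() method (quantile or maximum-likelihood fitting)
    # is an alternative to MCMC sampling; see the HDDM docs for its arguments.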

Hidden Markov Support Vector Machines 0.2

by pramod - April 16, 2010, 17:27:41 CET [ BibTeX Download ] 6169 views, 1610 downloads, 1 subscription

About: This software is an implementation of Hidden Markov Support Vector Machines (HMSVMs).

Changes:

Initial Announcement on mloss.org.

