All entries.
Showing items 511-520 of 567.

r-cran-longRPart 1.0

by r-cran-robot - March 7, 2008, 00:00:00 CET [ Project Homepage BibTeX Download ] 1258 views, 227 downloads, 0 subscriptions

About: Recursive partitioning of longitudinal data using mixed-effects models

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:06.201307


ChaLearn Gesture Challenge Turtle Tamers 1.0

by konkey - March 17, 2013, 18:39:22 CET [ BibTeX Download ] 1257 views, 494 downloads, 1 subscription

About: Solution developed by team Turtle Tamers in the ChaLearn Gesture Challenge (http://www.kaggle.com/c/GestureChallenge2)

Changes:

Initial Announcement on mloss.org.


Calculate Normalized Information Measures 1.0.0

by openpr_nlpr - December 2, 2011, 04:35:32 CET [ Project Homepage BibTeX Download ] 1256 views, 396 downloads, 1 subscription

About: The toolbox calculates normalized information measures from a given m-by-(m+1) confusion matrix for objective evaluation of an abstaining classifier. It includes a total of 24 normalized information measures based on three groups of definitions: mutual information, information divergence, and cross entropy. (A minimal sketch of one such measure follows this entry.)

Changes:

Initial Announcement on mloss.org.
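A minimal sketch of one of the mutual-information-based measures described above, assuming a NumPy environment; the function name and example matrix are illustrative, not the toolbox's own interface. It computes normalized mutual information I(T;Y)/H(T) from an m-by-(m+1) confusion matrix whose last column counts rejected (abstained) samples.

    import numpy as np

    def normalized_mutual_information(C):
        # C: m x (m+1) confusion matrix; rows = true classes,
        # columns = predicted classes plus a final "reject" column.
        P = C / C.sum()                      # joint distribution p(t, y)
        pt = P.sum(axis=1, keepdims=True)    # true-class marginal p(t)
        py = P.sum(axis=0, keepdims=True)    # prediction/reject marginal p(y)
        nz = P > 0
        mi = np.sum(P[nz] * np.log2(P[nz] / (pt @ py)[nz]))   # I(T; Y)
        ht = -np.sum(pt[pt > 0] * np.log2(pt[pt > 0]))        # H(T)
        return mi / ht

    # Two-class example with a rejection column.
    C = np.array([[40.0, 5.0, 5.0],
                  [4.0, 42.0, 4.0]])
    print(normalized_mutual_information(C))

The other measures in the toolbox replace mutual information with information divergence or cross entropy and differ in the normalization; this sketch only shows the general confusion-matrix-to-measure computation.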


Layer Based Dependency Parser 1.0.0

by openpr_nlpr - December 2, 2011, 04:51:23 CET [ Project Homepage BibTeX Download ] 1250 views, 411 downloads, 1 subscription

About: LDPar is an efficient data-driven dependency parser. You can train your own parsing model on treebank data and parse new data using the induced model.

Changes:

Initial Announcement on mloss.org.


r-cran-obliqueRF 0.2

by r-cran-robot - September 7, 2011, 00:00:00 CET [ Project Homepage BibTeX Download ] 1244 views, 255 downloads, 0 subscriptions

About: Oblique Random Forests from Recursive Linear Model Splits

Changes:

Fetched by r-cran-robot on 2012-08-01 00:00:07.607823


Semi Stochastic Gradient Descent 1.0

by konkey - July 9, 2014, 04:28:47 CET [ BibTeX BibTeX for corresponding Paper Download ] 1236 views, 318 downloads, 1 subscription

About: Efficient implementation of the Semi-Stochastic Gradient Descent algorithm (S2GD) for training L2-regularized logistic regression. (A minimal sketch of the update scheme follows this entry.)

Changes:

Initial Announcement on mloss.org.
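A minimal sketch of the S2GD idea for L2-regularized logistic regression, assuming NumPy; function names, defaults, and the fixed inner-loop length are illustrative simplifications (the published algorithm draws the inner-loop length from a specific distribution) and not this package's interface.

    import numpy as np

    def s2gd_logistic(X, y, lam=1e-3, h=0.1, epochs=10, m_inner=None, rng=None):
        # X: n x d data matrix, y: labels in {-1, +1}.
        rng = np.random.default_rng(rng)
        n, d = X.shape
        m_inner = m_inner or n
        w = np.zeros(d)

        def grad(wv, idx):
            # Gradient of lam/2 * ||w||^2 + mean_i log(1 + exp(-y_i x_i^T w))
            # restricted to the rows in idx.
            z = y[idx] * (X[idx] @ wv)
            coef = -y[idx] / (1.0 + np.exp(z))
            return X[idx].T @ coef / len(idx) + lam * wv

        for _ in range(epochs):
            g_full = grad(w, np.arange(n))        # full gradient at the outer iterate
            x = w.copy()
            for _ in range(m_inner):
                i = np.array([rng.integers(n)])
                # variance-reduced stochastic step
                x -= h * (grad(x, i) - grad(w, i) + g_full)
            w = x                                 # outer iterate update
        return w

The variance-reduced inner step is what distinguishes S2GD from plain SGD: each cheap stochastic gradient is corrected by the difference to the stored full gradient, so the step variance shrinks as the iterates converge.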


GPgrid toolkit for fast GP analysis on grid input 0.1

by ejg20 - September 16, 2013, 18:01:16 CET [ BibTeX Download ] 1194 views, 412 downloads, 1 subscription

About: GPgrid toolkit for fast Gaussian process (GP) analysis on grid input

Changes:

Initial Announcement on mloss.org.


IPCA v0.1

by kiraly - July 7, 2014, 10:25:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1181 views, 254 downloads, 1 subscription

About: This package implements Ideal PCA in MATLAB. Ideal PCA is a (cross-)kernel-based feature extraction algorithm which is (a) a faster alternative to kernel PCA and (b) a method for learning features that certify the data manifold.

Changes:

Initial Announcement on mloss.org.


ABACOC Adaptive Ball Cover for Classification 1.0

by kikot - July 14, 2014, 16:27:03 CET [ BibTeX BibTeX for corresponding Paper Download ] 1177 views, 328 downloads, 3 subscriptions

About: Online Action Recognition via Nonparametric Incremental Learning. Java and Matlab code are already available; a Python version and the Java source code will be released soon.

Changes:

Initial release of the library; future changes will be announced shortly.


About: Robust sparse representation has shown significant potential in solving challenging problems in computer vision such as biometrics and visual surveillance. Although several robust sparse models have been proposed and promising results have been obtained, they are designed either for error correction or for error detection, and learning a general framework that systematically unifies these two aspects and explores their relation is still an open problem. In this paper, we develop a half-quadratic (HQ) framework to solve the robust sparse representation problem. By defining different kinds of half-quadratic functions, the proposed HQ framework can perform both error correction and error detection. More specifically, using the additive form of HQ, we propose an L1-regularized error correction method that iteratively recovers corrupted data from errors incurred by noise and outliers; using the multiplicative form of HQ, we propose an L1-regularized error detection method that learns from uncorrupted data iteratively. We also show that the L1 regularization solved by the soft-thresholding function has a dual relationship to the Huber M-estimator, which theoretically guarantees the performance of robust sparse representation in terms of M-estimation. Experiments on robust face recognition under severe occlusion and corruption validate our framework and findings. (A minimal illustration of the soft-thresholding step follows this entry.)

Changes:

Initial Announcement on mloss.org.
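A minimal illustration, assuming NumPy, of the soft-thresholding operator the abstract refers to and its role in an L1-regularized error-correction step; the variable names and the toy residual are hypothetical, and this is not the paper's full half-quadratic solver.

    import numpy as np

    def soft_threshold(v, lam):
        # Closed-form minimizer of 0.5*(x - v)**2 + lam*|x|, elementwise.
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    def huber(r, lam):
        # Huber M-estimator loss with threshold lam, for comparison with
        # the L1 / soft-thresholding view mentioned in the abstract.
        a = np.abs(r)
        return np.where(a <= lam, 0.5 * r**2, lam * a - 0.5 * lam**2)

    # Toy error-correction step: given a residual r = b - A @ x, estimate
    # the sparse error e by minimizing 0.5*||r - e||**2 + lam*||e||_1,
    # whose solution is exactly the soft-thresholding of r.
    r = np.array([0.05, -3.2, 0.01, 1.5])
    e_hat = soft_threshold(r, lam=0.5)
    print(e_hat)   # large residual entries are kept (shrunk), small ones go to 0

Iterating such a step while re-estimating the sparse code is the flavor of error correction the abstract describes; the duality with the Huber loss is why the procedure behaves like a robust M-estimator.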

