All entries.
Showing Items 391-400 of 539 on page 40 of 54.

Multilinear Principal Component Analysis 1.2

by openpr_nlpr - April 16, 2012, 09:04:08 CET [ Project Homepage BibTeX Download ] 1780 views, 541 downloads, 1 subscription

About: This archive contains a Matlab implementation of the Multilinear Principal Component Analysis (MPCA) algorithm and MPCA+LDA, as described in the paper Haiping Lu, K.N. Plataniotis, and A.N. Venetsanopoulos, "MPCA: Multilinear Principal Component Analysis of Tensor Objects", IEEE Transactions on Neural Networks, Vol. 19, No. 1, pp. 18-39, January 2008.
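
For orientation only, here is a minimal NumPy sketch of the core MPCA idea: a projection basis per tensor mode obtained from the mode-wise scatter matrices. It is not the distributed Matlab code, it omits the paper's iterative refinement and dimension-selection steps, and all function names are hypothetical.

    # Minimal sketch of the MPCA idea (mode-wise scatter eigendecomposition);
    # not the released Matlab implementation, just an illustration.
    import numpy as np

    def mpca_sketch(X, ranks):
        """X: array of shape (M, I1, ..., In) holding M tensor samples.
        ranks: target dimensions (R1, ..., Rn), one per tensor mode.
        Returns one projection matrix (In x Rn) per mode."""
        Xc = X - X.mean(axis=0)                      # center the tensor samples
        n_modes = X.ndim - 1
        U = []
        for mode in range(1, n_modes + 1):
            # unfold every sample along this mode and accumulate the scatter matrix
            unfolded = np.moveaxis(Xc, mode, 1).reshape(X.shape[0], X.shape[mode], -1)
            scatter = sum(A @ A.T for A in unfolded)
            # leading eigenvectors of the mode-n scatter give the projection basis
            w, V = np.linalg.eigh(scatter)
            U.append(V[:, np.argsort(w)[::-1][:ranks[mode - 1]]])
        return U

    def project(X, U):
        """Multilinear projection of the centered samples with mode-wise bases U."""
        Y = X - X.mean(axis=0)
        for mode, Un in enumerate(U, start=1):
            Y = np.moveaxis(np.tensordot(Y, Un, axes=([mode], [0])), -1, mode)
        return Y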

Changes:

Initial Announcement on mloss.org.


treelearn 1

by iskander - September 21, 2011, 16:12:27 CET [ Project Homepage BibTeX Download ] 2270 views, 541 downloads, 1 subscription

About: A python implementation of Breiman's Random Forests.
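
treelearn's own interface is not documented in this listing, so the sketch below only illustrates Breiman's Random Forests recipe (bootstrap samples, randomized trees, majority vote), built for convenience on scikit-learn's DecisionTreeClassifier; the function names are hypothetical.

    # Illustration of the Random Forests recipe, NOT treelearn's API.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def fit_forest(X, y, n_trees=50, seed=0):
        rng = np.random.default_rng(seed)
        trees = []
        for _ in range(n_trees):
            idx = rng.integers(0, len(X), size=len(X))        # bootstrap sample
            t = DecisionTreeClassifier(max_features="sqrt")   # random feature subset per split
            trees.append(t.fit(X[idx], y[idx]))
        return trees

    def predict_forest(trees, X):
        votes = np.stack([t.predict(X) for t in trees])       # shape (n_trees, n_samples)
        # majority vote per sample; assumes integer class labels 0..K-1
        return np.array([np.bincount(col.astype(int)).argmax() for col in votes.T])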

Changes:

Initial Announcement on mloss.org.


DCABags 0.7

by wbuntine - June 5, 2014, 05:34:44 CET [ Project Homepage BibTeX Download ] 2396 views, 540 downloads, 4 subscriptions

About: Document/text preprocessing for topic models: a suite of Perl scripts that turns text collections into the dictionaries and bag/list files used by topic modelling software.

Changes:

Moved the distribution and code across to GitHub. Changed the "ldac" format to use 0-offset word indices. Added "document frequency" (df) filtering on the selection of tokens for linkTables. Still experimenting with linkParse, but it is not yet generally usable.
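
The "ldac" bags mentioned above follow the common LDA-C line layout (number of unique terms, then word:count pairs with 0-offset word indices into the dictionary). A hypothetical Python helper, not part of the Perl suite, illustrating that layout:

    # Hypothetical helper showing the "ldac" bag layout:
    #   <number_of_unique_terms> <word_index>:<count> ...
    # with 0-offset word indices.
    from collections import Counter

    def write_ldac(docs, dictionary, path):
        """docs: iterable of token lists; dictionary: dict token -> 0-based index."""
        with open(path, "w") as out:
            for tokens in docs:
                counts = Counter(dictionary[t] for t in tokens if t in dictionary)
                entries = " ".join(f"{idx}:{n}" for idx, n in sorted(counts.items()))
                out.write(f"{len(counts)} {entries}\n")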


OpenGM 2 2.0.2 beta

by opengm - June 1, 2012, 14:33:53 CET [ Project Homepage BibTeX Download ] 2387 views, 540 downloads, 1 subscription

About: A C++ Library for Discrete Graphical Models

Changes:

Initial Announcement on mloss.org.


Graphical Models and Conditional Random Fields Toolbox 2

by jdomke - January 5, 2012, 15:38:20 CET [ Project Homepage BibTeX Download ] 2249 views, 537 downloads, 1 subscription

About: A Matlab/C++ toolbox for learning and inference with graphical models, focused on parameter learning using marginalization in the high-treewidth setting.

Changes:

Initial Announcement on mloss.org.


OpenANN 1.1.0

by afabisch - September 26, 2013, 23:52:03 CET [ Project Homepage BibTeX Download ] 2512 views, 533 downloads, 2 subscriptions

About: A library for artificial neural networks.

Changes:

Added algorithms:

  • L-BFGS optimizer
  • k-means
  • sparse auto-encoder
  • preprocessing: normalization, PCA, ZCA whitening (a ZCA whitening sketch follows this list)
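
A standalone NumPy sketch of ZCA whitening, one of the preprocessing steps listed above; it illustrates the technique and is not OpenANN's API.

    # ZCA whitening: decorrelate features and rescale to unit variance while
    # staying as close as possible to the original data axes.
    import numpy as np

    def zca_whiten(X, eps=1e-5):
        """X: (n_samples, n_features). Returns whitened data and the ZCA matrix."""
        Xc = X - X.mean(axis=0)
        cov = Xc.T @ Xc / len(Xc)
        w, V = np.linalg.eigh(cov)                      # eigendecomposition of the covariance
        W = V @ np.diag(1.0 / np.sqrt(w + eps)) @ V.T   # ZCA transform: V diag(1/sqrt(w)) V^T
        return Xc @ W, W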

r-cran-penalizedSVM 1.1

by r-cran-robot - August 2, 2010, 00:00:00 CET [ Project Homepage BibTeX Download ] 2574 views, 532 downloads, 0 subscriptions

About: Feature Selection SVM using penalty functions
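
penalizedSVM is an R package; purely for illustration, here is a scikit-learn sketch of one instance of the idea, feature selection with an L1-penalized linear SVM. It is not the package's interface, and the package's own penalty functions may differ.

    # Feature selection via an L1-penalized linear SVM (sparse weight vector).
    from sklearn.svm import LinearSVC
    from sklearn.feature_selection import SelectFromModel

    def l1_svm_select(X, y, C=0.1):
        svm = LinearSVC(penalty="l1", dual=False, C=C).fit(X, y)  # L1 penalty zeroes out weights
        selector = SelectFromModel(svm, prefit=True)
        return selector.transform(X), selector.get_support()      # reduced X, boolean feature mask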

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:07.509844


Hivemall 0.1

by myui - October 25, 2013, 08:43:12 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3420 views, 531 downloads, 1 subscription

About: Hivemall is a scalable machine learning library running on Hive/Hadoop, licensed under the LGPL 2.1.

Changes:
  • Enhancement

    • Added AROW regression
    • Added AROW with a hinge loss (arowh_regress()); an AROW sketch follows this change list
  • Bugfix

    • Fixed a bug of null feature handling in classification/regression
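
For reference, a minimal NumPy sketch of the AROW update (Crammer et al., 2009) for binary classification; Hivemall exposes AROW through Hive UDFs such as the regression variants listed above, so this illustrates the underlying algorithm rather than Hivemall's interface.

    # AROW keeps a Gaussian distribution over weights (mean mu, covariance sigma)
    # and makes confidence-weighted updates on hinge-loss violations.
    import numpy as np

    class AROW:
        def __init__(self, dim, r=1.0):
            self.mu = np.zeros(dim)        # mean of the weight distribution
            self.sigma = np.eye(dim)       # covariance (confidence) of the weights
            self.r = r                     # regularization parameter

        def update(self, x, y):            # y in {-1, +1}
            margin = self.mu @ x
            loss = max(0.0, 1.0 - y * margin)          # hinge loss
            if loss > 0.0:
                sx = self.sigma @ x
                beta = 1.0 / (x @ sx + self.r)
                self.mu += loss * beta * y * sx        # confidence-weighted step
                self.sigma -= beta * np.outer(sx, sx)  # shrink confidence along x

        def predict(self, x):
            return 1 if self.mu @ x >= 0 else -1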

BayesPy 0.2.1

by jluttine - September 30, 2014, 16:35:11 CET [ Project Homepage BibTeX Download ] 1805 views, 529 downloads, 3 subscriptions

About: Variational Bayesian inference tools for Python
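
A short usage sketch based on BayesPy's documented node/VB interface around version 0.2, estimating the mean and precision of Gaussian data; treat the exact calls as an assumption and check the project documentation.

    # Variational Bayesian estimation of the mean and precision of 1-D Gaussian data.
    import numpy as np
    from bayespy.nodes import GaussianARD, Gamma
    from bayespy.inference import VB

    data = np.random.randn(100) + 3.0      # toy observations

    mu  = GaussianARD(0, 1e-6)             # vague prior on the mean
    tau = Gamma(1e-6, 1e-6)                # vague prior on the precision
    y   = GaussianARD(mu, tau, plates=(100,))
    y.observe(data)

    Q = VB(mu, tau, y)                     # variational Bayesian inference engine
    Q.update(repeat=20)                    # run VB iterations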

Changes:
  • Add workaround for matplotlib 1.4.0 bug related to interactive mode which affected monitoring

  • Fix bugs in Hinton diagrams for Gaussian variables


MaxEnt-PCA

About: This paper proposes an improved principal component analysis based on maximum entropy (MaxEnt) preservation, called MaxEnt-PCA, which is derived from a Parzen window estimate of Renyi's quadratic entropy. Instead of minimizing a reconstruction error based on the L2-norm or L1-norm, MaxEnt-PCA attempts to preserve as much of the data's uncertainty, measured by entropy, as possible. The optimal solution of MaxEnt-PCA consists of the eigenvectors of a Laplacian probability matrix corresponding to the MaxEnt distribution. MaxEnt-PCA (1) is rotation invariant, (2) is free from any distribution assumption, and (3) is robust to outliers. Extensive experiments on real-world datasets demonstrate the effectiveness of the proposed linear method compared to other related robust PCA methods.
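
For context, the Renyi quadratic entropy and its standard Gaussian Parzen-window estimate that this description builds on can be sketched as below; the precise objective and the construction of the Laplacian probability matrix are given in the paper.

    H_2(x) = -\log \int p(x)^2 \, dx,
    \qquad
    \hat{H}_2(W) = -\log \frac{1}{N^2} \sum_{i=1}^{N} \sum_{j=1}^{N}
        G_{\sqrt{2}\sigma}\!\left( W^\top (x_i - x_j) \right),
    \qquad
    \max_{W^\top W = I} \hat{H}_2(W).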

Changes:

Initial Announcement on mloss.org.

