Projects tagged with sparse learning.


Salad 0.6.0

by chwress - December 1, 2015, 16:17:35 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10226 views, 1924 downloads, 3 subscriptions

About: A Content Anomaly Detector based on n-Grams
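
For readers unfamiliar with the approach, n-gram content anomaly detection remembers the byte n-grams seen in benign traffic and scores a new payload by the fraction of previously unseen n-grams. A minimal conceptual sketch in Python (this is not Salad's C implementation; the function names are hypothetical):

    def byte_ngrams(data: bytes, n: int = 3):
        # all contiguous byte n-grams of a message
        return {data[i:i + n] for i in range(len(data) - n + 1)}

    def train(benign_messages, n: int = 3):
        # remember every n-gram observed in benign traffic
        model = set()
        for msg in benign_messages:
            model |= byte_ngrams(msg, n)
        return model

    def anomaly_score(model, msg: bytes, n: int = 3) -> float:
        # fraction of the message's n-grams never seen during training
        grams = byte_ngrams(msg, n)
        if not grams:
            return 0.0
        return sum(g not in model for g in grams) / len(grams)

A score close to 1 means almost no n-gram of the payload was seen during training, which flags the payload as anomalous.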

Changes:

After a full year of development we proudly present several new features, plenty of bug fixes and better performance :)

  • It is now possible to process data at bit granularity: salad [train|inspect] --binary
  • Performance improvements while preserving and further improving the readability of the source code.
  • Suppress Salad's verbose output: salad [train|predict] -q
  • The (unit) testing framework was extended to support tests of the overall application and memory checks using valgrind.
  • The testing mode was renamed: salad dbg -> salad test
  • Allow selecting either client- or server-side data when processing network communication.
  • libfoodstoragebox: a library encapsulating advanced data structures such as Bloom filters (see the sketch after this list).
  • Fixes for a critical bug when using group input and several minor issues.
  • An optionally compressed, text-based model file format: salad train -F (txt|archive)
  • The default hash set ('simple2') now makes use of the djb2 hash.
  • Flawless builds using gcc, mingw and clang.
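
The libfoodstoragebox item above refers to Bloom filters, a probabilistic set structure commonly used to store large n-gram sets compactly. A generic, minimal Bloom filter sketch in Python (class name and parameters are hypothetical and do not mirror libfoodstoragebox's API):

    import hashlib

    class BloomFilter:
        # Minimal Bloom filter: k salted SHA-1 hashes over an m-bit array.
        def __init__(self, m_bits: int = 8 * 1024, k: int = 3):
            self.m, self.k = m_bits, k
            self.bits = bytearray(m_bits // 8)

        def _positions(self, item: bytes):
            # k independent bit positions derived from salted hashes
            for i in range(self.k):
                h = hashlib.sha1(bytes([i]) + item).digest()
                yield int.from_bytes(h[:8], "big") % self.m

        def add(self, item: bytes):
            for p in self._positions(item):
                self.bits[p // 8] |= 1 << (p % 8)

        def __contains__(self, item: bytes) -> bool:
            # may return false positives, never false negatives
            return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))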

Probabilistic Classification Vector Machine 0.22

by fmschleif - November 10, 2015, 13:16:19 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4799 views, 1066 downloads, 3 subscriptions

About: PCVM library, a C++/Armadillo implementation of the Probabilistic Classification Vector Machine.

Changes:

30.10.2015: The code has been revised in several places, fixing some errors; different multiclass schemes and HDF5 file support have been added. Some speed-ups and memory savings through better handling of intermediate objects.

27.05.2015: Matlab binding under Windows available. Added a solution file for VS 2013 Express to compile a Matlab MEX binding. It is not yet confirmed that the code really uses multiple cores under Windows (under Linux it does).

29.04.2015: Added an implementation of the Nystroem-based PCVM, which includes Nystroem-based singular value decomposition (SVD), eigenvalue decomposition (EVD) and pseudo-inverse calculation (PINV); a rough sketch of the Nystroem construction appears at the end of this changelog.

22.04.2015: Implementation of the PCVM released.
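
As referenced in the 29.04.2015 entry, the Nystroem method approximates a large kernel matrix from m landmark columns, K ~ C W^+ C^T, from which approximate SVD/EVD and pseudo-inverses follow. A rough numpy sketch under these assumptions (RBF kernel and hypothetical helper names; this is not the PCVM library's C++/Armadillo API):

    import numpy as np

    def rbf_kernel(X, Y, gamma=1.0):
        # pairwise RBF kernel values between rows of X and rows of Y
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def nystroem_factor(X, m, gamma=1.0, seed=0):
        # Low-rank factor F with K(X, X) ~= F @ F.T, built from m landmark points.
        rng = np.random.default_rng(seed)
        idx = rng.choice(len(X), size=m, replace=False)
        C = rbf_kernel(X, X[idx], gamma)          # n x m slice of the kernel
        W = C[idx]                                # m x m landmark block
        lam, U = np.linalg.eigh(W)                # EVD of the small block
        keep = lam > 1e-10                        # drop near-zero modes (pseudo-inverse)
        F = C @ U[:, keep] / np.sqrt(lam[keep])   # K ~= C W^+ C.T = F F.T
        return F

    # An approximate EVD/SVD of the full kernel then follows from the thin SVD of F:
    # if F = Q S Vt, then K ~= Q S^2 Q.T.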


SALSA.jl 0.0.5

by jumutc - September 28, 2015, 17:28:56 CET [ Project Homepage BibTeX Download ] 1205 views, 241 downloads, 1 subscription

About: SALSA (Software lab for Advanced machine Learning with Stochastic Algorithms) is an implementation of well-known stochastic algorithms for Machine Learning, developed in the high-level technical computing language Julia. The SALSA software package is designed to address challenges in sparse linear modelling and in linear and non-linear Support Vector Machines applied to large data samples, with a user-centric and user-friendly emphasis.
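
As an illustration of the stochastic setting SALSA targets, here is a Pegasos-style stochastic subgradient solver for the linear hinge-loss SVM, written as a generic Python sketch (SALSA itself is a Julia package; none of this reflects its API):

    import numpy as np

    def pegasos_svm(X, y, lam=0.01, epochs=5, seed=0):
        # Stochastic subgradient descent for
        #   min_w  lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * w.x_i),  y_i in {-1, +1}.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w, t = np.zeros(d), 0
        for _ in range(epochs):
            for i in rng.permutation(n):
                t += 1
                eta = 1.0 / (lam * t)              # decaying step size
                margin = y[i] * (X[i] @ w)
                w *= (1.0 - eta * lam)             # shrink towards zero (regularizer)
                if margin < 1.0:                   # hinge loss is active
                    w += eta * y[i] * X[i]
        return w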

Changes:

Initial Announcement on mloss.org.


About: Learns dynamic network changes across conditions and visualizes the results in Cytoscape.

Changes:

Initial Announcement on mloss.org.


DAL 1.1

by ryota - February 18, 2014, 19:07:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 18183 views, 2957 downloads, 1 subscription

About: DAL is an efficient and flexible MATLAB toolbox for sparse/low-rank learning/reconstruction based on the dual augmented Lagrangian method.
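
For reference, the prototypical problem DAL addresses is (weighted) l1-regularized least squares; a plain proximal-gradient (ISTA) baseline for that objective is sketched below in Python. This is deliberately not the dual augmented Lagrangian algorithm DAL implements, and the function names are hypothetical:

    import numpy as np

    def soft_threshold(v, t):
        # elementwise proximal operator of t * |.|
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def weighted_lasso_ista(A, b, weights, n_iter=500):
        # ISTA for 0.5 * ||A x - b||^2 + sum_j weights[j] * |x_j|
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - b)
            x = soft_threshold(x - grad / L, weights / L)
        return x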

Changes:
  • Supports weighted lasso (dalsqal1.m, dallral1.m)
  • Supports weighted squared loss (dalwl1.m)
  • Bug fixes (group lasso and elastic-net-regularized logistic regression)

About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference.
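
As a minimal illustration of GLM MAP estimation in this setting, the sketch below fits a Bernoulli-logit model with a Gaussian prior by plain gradient ascent (Python for illustration only; glm-ie is a Matlab/Octave toolbox and its scalable solvers work quite differently):

    import numpy as np

    def map_logistic(X, y, tau=1.0, lr=0.1, n_iter=1000):
        # MAP estimate for a logistic GLM with isotropic Gaussian prior:
        #   maximise  sum_i log sigmoid(y_i * x_i.w) - ||w||^2 / (2 * tau^2),  y_i in {-1, +1}.
        w = np.zeros(X.shape[1])
        for _ in range(n_iter):
            z = y * (X @ w)
            sig = 1.0 / (1.0 + np.exp(-z))
            grad = X.T @ (y * (1.0 - sig)) - w / tau ** 2   # log-posterior gradient
            w += lr * grad / len(y)
        return w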

Changes:

added factorial mean field inference as a third algorithm complementing expectation propagation and variational Bayes

generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used


FABIA 2.8.0

by hochreit - October 18, 2013, 10:14:57 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 13638 views, 2840 downloads, 1 subscription

Rating: 4.5 / 5 (based on 1 vote)

About: FABIA is a biclustering algorithm that clusters rows and columns of a matrix simultaneously. Consequently, members of a row cluster are similar to each other on a subset of columns and, analogously, members of a column cluster are similar to each other on a subset of rows. Biclusters are found by factor analysis where both the factors and the loading matrix are sparse. FABIA is a multiplicative model that extracts linear dependencies between samples and feature patterns. Applications include detection of transcriptional modules in gene expression data and identification of haplotypes / 'identity by descent' segments consisting of rare variants obtained by next generation sequencing.
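
FABIA's multiplicative model writes the data matrix as X ~= L Z + noise with sparse loadings L and sparse factors Z, so bicluster k is read off as the rows where |L[:, k]| is large and the columns where |Z[k, :]| is large. A tiny sketch of that read-out step (Python for illustration; thresholds and names are hypothetical, not the R package's interface):

    import numpy as np

    def biclusters_from_factors(L, Z, thr_row=0.5, thr_col=0.5):
        # L: n_rows x p loadings, Z: p x n_cols factors, both assumed sparse.
        # Bicluster k = (rows where |L[:, k]| is large) x (columns where |Z[k, :]| is large).
        out = []
        for k in range(L.shape[1]):
            rows = np.flatnonzero(np.abs(L[:, k]) > thr_row)
            cols = np.flatnonzero(np.abs(Z[k, :]) > thr_col)
            out.append((rows, cols))
        return out

    # With X ~= L @ Z + noise, each (rows, cols) pair marks a submatrix whose entries
    # share the rank-one pattern np.outer(L[rows, k], Z[k, cols]).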

Changes:

CHANGES IN VERSION 2.8.0

NEW FEATURES

o rescaling of lapla
o extractPlot does not plot sorted matrices

CHANGES IN VERSION 2.4.0

o spfabia bugfixes

CHANGES IN VERSION 2.3.1

NEW FEATURES

o Getters and setters for class Factorization

2.0.0:

  • spfabia: fabia for a sparse data matrix (in sparse matrix format) and sparse vector/matrix computations in the code to speed up computations. spfabia applications: (a) detecting 'identity by descent' in next generation sequencing data with rare variants, (b) detecting 'shared haplotypes' in disease studies based on next generation sequencing data with rare variants;
  • fabia for non-negative factorization (parameter: non_negative);
  • changed to C and removed dependencies on Rcpp;
  • improved update for lambda (alpha should be smaller, e.g. 0.03);
  • introduced maximal number of row elements (lL);
  • introduced cycle bL when upper bounds nL or lL are effective;
  • reduced computational complexity;
  • bug fixes: (a) update formula for lambda: tighter approximation, (b) corrected inverse of the conditional covariance matrix of z;

1.4.0:

  • New option nL: maximal number of biclusters per row element;
  • Sort biclusters according to information content;
  • Improved and extended preprocessing;
  • Update to R2.13

Linear SVM with general regularization 1.0

by rflamary - October 5, 2012, 15:34:21 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4667 views, 1346 downloads, 1 subscription

About: This package is an implementation of a linear SVM solver with a wide class of regularizations on the SVM weight vector (l1, l2, mixed norm l1-lq, adaptive lasso). We provide solvers for the classical single-task SVM problem and for multi-task learning with joint feature selection or a similarity-promoting term.
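
For the mixed-norm l1-lq case with q = 2, the ingredient that produces joint feature selection across tasks is the row-wise group soft-thresholding (proximal) operator, sketched generically below (Python for illustration; this is not the toolbox's Matlab code and the step size t is an assumption):

    import numpy as np

    def prox_l1_l2(W, t):
        # Row-wise group soft-thresholding: proximal operator of t * sum_j ||W[j, :]||_2.
        # With tasks in columns, this zeroes whole feature rows -> joint feature selection.
        norms = np.linalg.norm(W, axis=1, keepdims=True)
        scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
        return W * scale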

Changes:

Initial Announcement on mloss.org.


Sparse MultiTask Learning Toolbox 1.2

by rflamary - March 18, 2012, 11:31:00 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5184 views, 1191 downloads, 1 subscription

About: This package is a set of Matlab scripts that implements the algorithms described in the submitted paper: "Lp-Lq Sparse Linear and Sparse Multiple Kernel MultiTask Learning".

Changes:

Initial Announcement on mloss.org.


About: Matlab implementation of variational Gaussian approximate inference for Bayesian Generalized Linear Models.

Changes:

Code restructure and bug fix.


About: The package estimates the matrix of partial correlations based on different regularized regression methods: lasso, adaptive lasso, PLS, and Ridge Regression.
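
A common recipe behind such estimators regresses each variable on all others and combines, for each pair (i, j), the coefficient of j in the regression of i with its counterpart from the reverse regression. A minimal ridge-based sketch (Python for illustration; the function name and the ridge choice are assumptions, and the package itself also covers lasso, adaptive lasso and PLS):

    import numpy as np

    def partial_correlations_ridge(X, lam=1.0):
        # Nodewise ridge regressions, then
        #   rho_ij = sign(b_ij) * sqrt(b_ij * b_ji)  (set to 0 if the product is negative).
        n, p = X.shape
        X = X - X.mean(0)
        B = np.zeros((p, p))
        for j in range(p):
            others = [k for k in range(p) if k != j]
            A = X[:, others]
            beta = np.linalg.solve(A.T @ A + lam * np.eye(p - 1), A.T @ X[:, j])
            B[j, others] = beta
        prod = B * B.T
        P = np.sign(B) * np.sqrt(np.clip(prod, 0.0, None))
        np.fill_diagonal(P, 1.0)
        return P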

Changes:

Initial Announcement on mloss.org.