Project details for DAL

DAL 0.97

by ryota - April 13, 2009, 09:39:59 CET


  • DAL is an efficient and flexible toolbox for solving the following optimization problem:

    minimize f(Ax) + lambda*c(x)

    where A (m x n) is a design matrix, f is a loss function, and c is a measure of sparsity.

  • DAL can handle your favorite (convex, smooth) loss functions.

  • DAL can handle A (and its transpose) provided as function handles.

  • DAL can handle several "sparsity" measures in a unified way. Currently L1 and grouped L1 measures are supported.

  • DAL is efficient when m << n (m: #samples, n: #unknowns) or the matrix A is poorly conditioned.

  • DAL is fast when the solution is sparse, even when the matrix A is dense.

  • DAL is written in MATLAB.

Changes to previous version:

Initial Announcement.

Supported Operating Systems: Agnostic
Data Formats: Binary
Tags: Optimization, Group Lasso, Lasso, Sparse Learning

Other available revisions

February 18, 2014, 19:07:06
  • Supports weighted lasso (dalsqal1.m, dallral1.m)
  • Supports weighted squared loss (dalwl1.m)
  • Bug fixes (group lasso and elastic-net-regularized logistic regression)

May 3, 2011, 07:00:43
  • 35% faster group lasso.
  • Sparse connectivity inference example added (s_test_hsgl.m).
  • Non-negative lasso (thanks to Shigeyuki Oba).
  • Uses Mark Tygert's pca.m for SVD (PROPACK is no longer required).

December 14, 2009, 09:43:50
  • Logistic loss: dallrl1.m, dallrgl.m, dallrds.m
  • Unequal-sized blocks supported in group lasso regularization
  • Initial eta set to 0.01/lambda
  • dallrds.m: trace-norm regularized logistic regression (requires PROPACK)

April 13, 2009, 09:39:59
  • Initial Announcement.

