mloss.org — Linear SVM with general regularization 1.0
http://mloss.org/software/view/426/
Fri, 05 Oct 2012

<html><p>We provide a general solver for the squared hinge loss SVM of the form:</p>
<p>min_{w,b} sum_i max(0, 1 - y_i(x_i^T w + b))^2 + Omega(w)</p>
<p>where Omega(w) can be:</p>
<ul>
<li> l1: Omega(w) = sum_i |w_i| </li>
<li> l2: Omega(w) = sum_i |w_i|^2 </li>
<li> l1-l2: Omega(w) = sum_g ||w_g||_2 </li>
<li> l1-lp: Omega(w) = sum_g ||w_g||_p </li>
<li> adaptive l1-l2: Omega(w) = sum_g beta_g ||w_g||_2 </li>
</ul>
<p>We also provide a multitask solver in which T tasks are learned simultaneously under joint sparsity constraints (mixed-norm regularization).</p>
<p>Note that this toolbox has been designed to be efficient on dense data, whereas most existing linear SVM solvers are designed for sparse datasets.</p></html>

Authors: Remi Flamary, Alain Rakotomamonjy

Tags: large scale, kernel machines, svm, bci, classification, support vector machines, feature selection, linear svm, convex optimization, gradient based learning, manifold learning, optimization, algorithms, feature weighting, trace norm, toolbox, group lasso, lasso, sparse learning, quadratic programming, weighting, l1 regularization, large datasets, regularization, pattern recognition, discriminant analysis, linear model, generalized linear models, multiclass support vector machine, l1 minimization, sparse representation, l1 norm, l21 norm, dimension reduction, multi task
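To illustrate the objective above, here is a minimal NumPy sketch — NOT the toolbox's actual API or algorithm — that minimizes the squared hinge loss with the l2 penalty by plain gradient descent, together with the block soft-thresholding (proximal) operator that a proximal method would apply for the l1-l2 (group lasso) penalty. All function names, hyperparameters, and the toy data are illustrative assumptions.

```python
# Illustrative sketch only: squared hinge loss SVM with l2 regularization
# solved by gradient descent, plus the prox of the l1-l2 group penalty.
import numpy as np

def squared_hinge_svm_l2(X, y, lam=0.1, lr=0.01, n_iter=500):
    """Minimize sum_i max(0, 1 - y_i (x_i^T w + b))^2 + lam * ||w||_2^2."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_iter):
        margins = y * (X @ w + b)
        active = margins < 1                      # samples with nonzero loss
        # d/dw of max(0, 1 - m_i)^2 is -2 (1 - m_i) y_i x_i on active samples
        coef = -2.0 * y[active] * (1.0 - margins[active])
        w -= lr * (X[active].T @ coef + 2.0 * lam * w)
        b -= lr * coef.sum()
    return w, b

def prox_group_l2(w, groups, tau):
    """Prox of tau * sum_g ||w_g||_2: shrink each group's norm toward zero."""
    w = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        w[g] *= max(0.0, 1.0 - tau / norm) if norm > 0 else 0.0
    return w

# Toy dense two-class data (hypothetical, just to exercise the solver).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])
w, b = squared_hinge_svm_l2(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```

A proximal-gradient variant of the same loop would take the gradient step on the loss term only and then apply `prox_group_l2` (scaled by the step size) to obtain group-sparse solutions, which is the standard way the l1-l2 and adaptive l1-l2 penalties listed above are handled.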