Projects running under Linux.
Showing items 21-40 of 255 (page 2 of 13).

XGBoost v0.4.0

by crowwork - May 12, 2015, 08:57:16 CET [ Project Homepage BibTeX Download ] 9130 views, 1787 downloads, 3 subscriptions

About: XGBoost (eXtreme Gradient Boosting) is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm, can automatically parallelize computation with OpenMP, and can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification and ranking, and is designed to be extensible, so that users can easily define their own objectives. The newest version of xgboost supports distributed learning on platforms such as Hadoop and MPI, scaling to even larger problems. (A usage sketch follows the change list below.)

  • Distributed version of xgboost that runs on YARN, scales to billions of examples

  • Direct save/load data and model from/to S3 and HDFS

  • Feature importance visualization in R module, by Michael Benesty

  • Predict leaf index

  • Poisson regression for count data

  • Early stopping option in training

  • Native save/load support in R and Python

  • xgboost models can now be saved using save/load in R

  • xgboost Python models are now picklable

  • The sklearn wrapper is supported in the Python module

  • Experimental external memory version
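
As a quick, hedged illustration of the sklearn wrapper and pickling support listed above (the data here is synthetic and the parameter values are illustrative, not recommendations):

    # Sketch: sklearn-style wrapper plus pickling of the fitted model.
    import pickle
    import numpy as np
    import xgboost as xgb

    X = np.random.rand(200, 10)
    y = np.random.randint(2, size=200)

    clf = xgb.XGBClassifier(n_estimators=50, max_depth=3)  # sklearn-compatible estimator
    clf.fit(X, y)
    pred = clf.predict(X)

    clf2 = pickle.loads(pickle.dumps(clf))                 # models are picklable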

LOMO Feature Extraction and XQDA Metric Learning for Person Re-identification 1.0

by openpr_nlpr - May 6, 2015, 11:38:32 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1151 views, 177 downloads, 3 subscriptions

Rating: 0/5 stars (based on 1 vote)

About: This MATLAB package provides the LOMO feature extraction and XQDA metric learning algorithms proposed in our CVPR 2015 paper. It is fast and effective for person re-identification. For more details, please visit the project homepage.


Initial Announcement.

streamDM 0.0.1

by abifet - April 28, 2015, 12:34:00 CET [ Project Homepage BibTeX Download ] 900 views, 371 downloads, 1 subscription

About: streamDM is a new open source data mining and machine learning library, designed on top of Spark Streaming, an extension of the core Spark API that enables scalable, high-throughput, fault-tolerant stream processing of data streams.


Initial Announcement.

BLOG 0.9.1

by jxwuyi - April 27, 2015, 06:52:05 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1172 views, 247 downloads, 3 subscriptions

About: Bayesian Logic (BLOG) is a probabilistic modeling language. It is designed for representing relations and uncertainties among real-world objects.


Initial Announcement.

FsAlg 0.5.4

by gbaydin - April 25, 2015, 02:11:03 CET [ Project Homepage BibTeX Download ] 806 views, 246 downloads, 1 subscription

About: FsAlg is a linear algebra library that supports generic types.


Initial Announcement.

Java Machine Learning Platform 1.0

by openpr_nlpr - April 2, 2015, 09:02:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1178 views, 211 downloads, 2 subscriptions

About: Jmlp is a Java platform for both machine learning experiments and applications. It has been tested on Windows, and it should also run on Linux since Java is cross-platform. It contains classical classification algorithms (Discrete AdaBoost.MH, Real AdaBoost.MH, SVM, KNN, MCE, MLP, NB) and feature reduction methods (KPCA, PCA, whitening), among others.


Initial Announcement.

Harry 0.4.0

by konrad - March 30, 2015, 14:03:12 CET [ Project Homepage BibTeX Download ] 5222 views, 1130 downloads, 2 subscriptions

About: A Tool for Measuring String Similarity


The new release supports measuring string similarity at the granularity of bytes, bits and tokens. A Python interface has been added. Several minor bugs have been fixed.

Theano 0.7

by jaberg - March 27, 2015, 16:40:18 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 18753 views, 3474 downloads, 3 subscriptions

About: A Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. It dynamically generates CPU and GPU modules for good performance. The Deep Learning Tutorials illustrate deep learning with Theano. (A small usage sketch follows the release notes below.)


Theano 0.7 (26th of March, 2015)

We recommend that everyone upgrade to this version.


* Integration of CuDNN for 2D convolutions and pooling on supported GPUs
* Too many optimizations and new features to count
* Various fixes and improvements to scan
* Better support for GPU on Windows
* On Mac OS X, clang is used by default
* Many crash fixes
* Some bug fixes as well
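
To make the define/optimize/evaluate workflow concrete, here is a minimal, hedged sketch (purely illustrative; it is not taken from the release notes above):

    # Sketch: define a symbolic expression, let Theano optimize and compile it, evaluate it.
    import theano
    import theano.tensor as T

    a = T.dvector('a')
    b = T.dvector('b')
    expr = T.dot(a, b) + (a ** 2).sum()   # symbolic expression over arrays
    f = theano.function([a, b], expr)     # compiled (optionally GPU-accelerated) function
    print(f([1, 2, 3], [4, 5, 6]))        # dot = 32.0, sum of squares = 14.0 -> 46.0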

JMLR Sally 1.0.0

by konrad - March 26, 2015, 17:01:35 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 30268 views, 5913 downloads, 3 subscriptions

About: A Tool for Embedding Strings in Vector Spaces


Support for explicit selection of granularity has been added. Several minor bug fixes. We have reached 1.0.

Hivemall 0.3

by myui - March 13, 2015, 17:08:22 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6840 views, 1143 downloads, 3 subscriptions

About: Hivemall is a scalable machine learning library running on Hive/Hadoop.

  • Added support for matrix factorization
  • Added support for TF-IDF computation
  • Added support for AdaGrad/AdaDelta
  • Added support for AdaGradRDA classification
  • Added a normalization scheme

libcmaes 0.9.5

by beniz - March 9, 2015, 09:05:22 CET [ Project Homepage BibTeX Download ] 6421 views, 1311 downloads, 3 subscriptions

About: Libcmaes is a multithreaded C++11 library (with Python bindings) for high-performance blackbox stochastic optimization of difficult, possibly non-linear and non-convex, functions, using CMA-ES (the Covariance Matrix Adaptation Evolution Strategy). Libcmaes can be used to minimize or maximize any function without requiring gradient information or differentiability.


This is a major release, with several novelties, improvements and fixes, among which:

  • step-size two-point adaptation scheme for improved performance in some settings, ref #88

  • important bug fixes to the ACM surrogate scheme, ref #57, #106

  • simple high-level workflow under Python, ref #116

  • improved performance in high dimensions, ref #97

  • improved profile likelihood and contour computations, including under geno/pheno transforms, ref #30, #31, #48

  • elitist mechanism for forcing best solutions during evolution, ref #103

  • new legacy plotting function, ref #110

  • optional initial function value, ref #100

  • improved C++ API, ref #89

  • Python bindings support with Anaconda, ref #111

  • configure script now tries to detect numpy when building Python bindings, ref #113

  • Python bindings now have embedded documentation, ref #114

  • support for Travis continuous integration, ref #122

  • lower resolution random seed initialization

CN24 Convolutional Neural Networks for Semantic Segmentation 1.0

by erik - February 23, 2015, 09:02:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1466 views, 301 downloads, 1 subscription

About: CN24 is a complete semantic segmentation framework using fully convolutional networks.


Initial Announcement.

JMLR DLLearner 1.0

by Jens - February 13, 2015, 11:39:46 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 17822 views, 4340 downloads, 6 subscriptions

Rating: 4.5/5 stars (based on 3 votes)

About: The DL-Learner framework contains several algorithms for supervised concept learning in Description Logics (DLs) and OWL.



Auto-encoder Based Data Clustering Toolkit 1.0

by openpr_nlpr - February 10, 2015, 08:30:55 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1344 views, 236 downloads, 2 subscriptions

About: The auto-encoder based data clustering toolkit provides a quick start for clustering based on deep auto-encoder nets. The toolkit can cluster data in feature space with deep nonlinear nets.


Initial Announcement.
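
The general recipe behind such toolkits, train an auto-encoder to reconstruct the data and then cluster the learned codes, can be sketched in Python. The sketch below is only a hedged stand-in: it uses scikit-learn's MLPRegressor as a shallow auto-encoder and runs k-means on its hidden layer, and it does not reproduce this toolkit's deep architecture or training procedure.

    # Hedged illustration of auto-encoder based clustering (not the toolkit's code).
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.cluster import KMeans

    X = np.random.rand(300, 20)

    ae = MLPRegressor(hidden_layer_sizes=(5,), activation='relu',
                      max_iter=2000, random_state=0)
    ae.fit(X, X)                                        # train to reconstruct the input

    # hidden codes = relu(X W1 + b1), computed from the fitted weights
    codes = np.maximum(0, X @ ae.coefs_[0] + ae.intercepts_[0])
    labels = KMeans(n_clusters=3, random_state=0).fit_predict(codes)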

Histogram of Oriented Gradient 1.0

by openpr_nlpr - February 10, 2015, 08:27:55 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1097 views, 212 downloads, 2 subscriptions

About: This is an exact implementation of the Histogram of Oriented Gradients (HOG) descriptor as described in the paper by Dalal and Triggs.


Initial Announcement.
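
For readers unfamiliar with the descriptor, the core computation, per-cell histograms of gradient orientations weighted by gradient magnitude, can be sketched in NumPy. This is a simplified illustration only: it omits the block normalization and exact parameter choices of the Dalal-Triggs pipeline and is not this package's code.

    # Simplified HOG core: per-cell orientation histograms, magnitude-weighted.
    import numpy as np

    def cell_histograms(img, cell=8, bins=9):
        gy, gx = np.gradient(img.astype(float))
        mag = np.hypot(gx, gy)
        ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0    # unsigned orientation in [0, 180)
        h, w = img.shape
        ch, cw = h // cell, w // cell
        hist = np.zeros((ch, cw, bins))
        for i in range(ch):
            for j in range(cw):
                m = mag[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
                a = ang[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
                idx = (a / (180.0 / bins)).astype(int) % bins
                np.add.at(hist[i, j], idx.ravel(), m.ravel())
        return hist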

JMLR Information Theoretical Estimators 0.61

by szzoli - February 8, 2015, 14:04:27 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 77773 views, 15490 downloads, 2 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE supports additionally (i) the combinations of the estimation techniques, (ii) the easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.

  • Explicit additive constant computation in generalized kNN-based Rényi entropy estimators: added following an enhancement suggestion.

  • Analytical value computation of the exponentiated Jensen-Rényi kernel-2: simplified.

JMLR SHOGUN 4.0.0

by sonne - February 5, 2015, 09:09:37 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 98855 views, 14000 downloads, 6 subscriptions

Rating: 3/5 stars (based on 6 votes)

About: The SHOGUN machine learning toolbox focuses on large-scale learning methods, with an emphasis on Support Vector Machines (SVMs), and provides interfaces to Python, Octave, MATLAB, R and the command line. (A minimal usage sketch follows the change lists below.)


This release features the work of our 8 GSoC 2014 students [student; mentors]:

  • OpenCV Integration and Computer Vision Applications [Abhijeet Kislay; Kevin Hughes]
  • Large-Scale Multi-Label Classification [Abinash Panda; Thoralf Klein]
  • Large-scale structured prediction with approximate inference [Jiaolong Xu; Shell Hu]
  • Essential Deep Learning Modules [Khaled Nasr; Sergey Lisitsyn, Theofanis Karaletsos]
  • Fundamental Machine Learning: decision trees, kernel density estimation [Parijat Mazumdar ; Fernando Iglesias]
  • Shogun Missionary & Shogun in Education [Saurabh Mahindre; Heiko Strathmann]
  • Testing and Measuring Variable Interactions With Kernels [Soumyajit De; Dino Sejdinovic, Heiko Strathmann]
  • Variational Learning for Gaussian Processes [Wu Lin; Heiko Strathmann, Emtiyaz Khan]

It also contains several new features, cleanups and bugfixes:


  • New Shogun project description [Heiko Strathmann]
  • ID3 algorithm for decision tree learning [Parijat Mazumdar]
  • New modes for PCA matrix factorizations: SVD & EVD, in-place or reallocating [Parijat Mazumdar]
  • Add Neural Networks with linear, logistic and softmax neurons [Khaled Nasr]
  • Add kernel multiclass strategy examples in multiclass notebook [Saurabh Mahindre]
  • Add decision trees notebook containing examples for ID3 algorithm [Parijat Mazumdar]
  • Add sudoku recognizer ipython notebook [Alejandro Hernandez]
  • Add in-place subsets on features, labels, and custom kernels [Heiko Strathmann]
  • Add Principal Component Analysis notebook [Abhijeet Kislay]
  • Add Multiple Kernel Learning notebook [Saurabh Mahindre]
  • Add Multi-Label classes to enable Multi-Label classification [Thoralf Klein]
  • Add rectified linear neurons, dropout and max-norm regularization to neural networks [Khaled Nasr]
  • Add C4.5 algorithm for multiclass classification using decision trees [Parijat Mazumdar]
  • Add support for arbitrary acyclic graph-structured neural networks [Khaled Nasr]
  • Add CART algorithm for classification and regression using decision trees [Parijat Mazumdar]
  • Add CHAID algorithm for multiclass classification and regression using decision trees [Parijat Mazumdar]
  • Add Convolutional Neural Networks [Khaled Nasr]
  • Add Random Forests algorithm for ensemble learning using CART [Parijat Mazumdar]
  • Add Restricted Boltzmann Machines [Khaled Nasr]
  • Add Stochastic Gradient Boosting algorithm for ensemble learning [Parijat Mazumdar]
  • Add Deep contractive and denoising autoencoders [Khaled Nasr]
  • Add Deep belief networks [Khaled Nasr]


  • Fix reference counting bugs in CList when reference counting is on [Heiko Strathmann, Thoralf Klein, lambday]
  • Fix memory problem in PCA::apply_to_feature_matrix [Parijat Mazumdar]
  • Fix crash in LeastAngleRegression for the case D greater than N [Parijat Mazumdar]
  • Fix memory violations in bundle method solvers [Thoralf Klein]
  • Fix fail in library_mldatahdf5.cpp example when mldata.org is not working properly [Parijat Mazumdar]
  • Fix memory leaks in Vowpal Wabbit, LibSVMFile and KernelPCA [Thoralf Klein]
  • Fix memory and control flow issues discovered by Coverity [Thoralf Klein]
  • Fix R modular interface SWIG typemap (Requires SWIG >= 2.0.5) [Matt Huska]

Cleanup and API Changes

  • PCA now depends on Eigen3 instead of LAPACK [Parijat Mazumdar]
  • Removing redundant and fixing implicit imports [Thoralf Klein]
  • Hide many methods from SWIG, reducing compile memory by 500MiB [Heiko Strathmann, Fernando Iglesias, Thoralf Klein]
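
As a minimal, hedged sketch of training a kernel SVM through the Python interface mentioned above: it assumes SHOGUN is built with the Python modular bindings (modshogun), the class names and constructors follow that API, and the data and parameter values are synthetic and illustrative only.

    # Sketch: Gaussian-kernel SVM via SHOGUN's Python modular interface (assumed API).
    import numpy as np
    from modshogun import RealFeatures, BinaryLabels, GaussianKernel, LibSVM

    X = np.random.randn(2, 100)                   # SHOGUN expects examples as columns
    y = np.sign(X[0, :] + X[1, :])                # labels in {-1, +1}

    feats = RealFeatures(X)
    labels = BinaryLabels(y)
    kernel = GaussianKernel(feats, feats, 1.0)    # kernel width
    svm = LibSVM(1.0, kernel, labels)             # regularization C, kernel, labels
    svm.train()
    predictions = svm.apply(feats).get_labels()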

Distributed Frank-Wolfe Algorithm 0.02

by alirezabagheri - January 28, 2015, 00:35:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1281 views, 378 downloads, 2 subscriptions

About: Distributed optimization: Support Vector Machines and LASSO regression on distributed data


Initial Upload
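
As background on the underlying algorithm (a plain single-machine version, not this package's distributed implementation), one Frank-Wolfe loop for the L1-constrained least-squares problem min_x ||Ax - b||^2 subject to ||x||_1 <= tau can be sketched as follows:

    # Sketch: single-machine Frank-Wolfe for L1-constrained least squares (LASSO form).
    import numpy as np

    def frank_wolfe_lasso(A, b, tau, iters=200):
        n = A.shape[1]
        x = np.zeros(n)
        for t in range(iters):
            grad = A.T @ (A @ x - b)             # gradient of 0.5 * ||Ax - b||^2
            i = np.argmax(np.abs(grad))          # linear minimization oracle over the L1 ball
            s = np.zeros(n)
            s[i] = -tau * np.sign(grad[i])       # best vertex of the ball
            gamma = 2.0 / (t + 2.0)              # standard diminishing step size
            x = (1 - gamma) * x + gamma * s
        return x

    A = np.random.randn(50, 100)
    b = np.random.randn(50)
    x_hat = frank_wolfe_lasso(A, b, tau=5.0)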

fertilized forests 1.0beta

by Chrisl_S - January 23, 2015, 16:04:31 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1237 views, 300 downloads, 1 subscription

About: The fertilized forests project aims to provide an easy-to-use, easy-to-extend, yet fast library for decision forests. It summarizes the research in this field and provides a solid platform for extending it. Consistent interfaces to C++, Python and MATLAB, together with availability for all major compilers, give the user high flexibility in using the library.


Initial Announcement.

Hub Miner 1.1

by nenadtomasev - January 22, 2015, 16:33:51 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2272 views, 483 downloads, 2 subscriptions

About: Hubness-aware Machine Learning for High-dimensional Data

  • BibTeX support for all algorithm implementations, making all of them easy to reference (via the algref package).

  • Two more hubness-aware approaches (meta-metric-learning and feature construction)

  • An implementation of Hit-Miss networks for analysis.

  • Several minor bug fixes.

  • The following instance selection methods were added: HMScore, Carving, Iterative Case Filtering, ENRBF.

  • The following clustering quality indexes were added: Fowlkes-Mallows, Calinski-Harabasz, PBM, G+, Tau, Point-Biserial, Hubert's statistic, McClain-Rao, C-root-k.

  • Some more experimental scripts have been included.

  • Extensions in the estimation of hubness risk.

  • Alias and weighted reservoir methods for weight-proportional random selection.
