Projects supporting the agnostic data format.
Showing items 1-20 of 53 (page 1 of 3).

Optunity 1.1.1

by claesenm - September 30, 2015, 07:06:17 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3455 views, 878 downloads, 2 subscriptions

About: Optunity is a library containing various optimizers for hyperparameter tuning. Hyperparameter tuning is a recurrent problem in many machine learning tasks, both supervised and unsupervised. This package provides several distinct approaches to solve such problems, along with helpful facilities such as cross-validation and a plethora of score functions.


This minor release has the same feature set as Optunity 1.1.0, but incorporates several bug fixes, mostly related to the specification of structured search spaces.
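
As a rough sketch of how such a tuner is typically driven from Python (the objective below and its C/gamma search box are made-up placeholders, not part of Optunity):

    import optunity

    # Stand-in objective: in practice this would be, e.g., cross-validated accuracy
    # as a function of the hyperparameters (the names C and gamma are examples only).
    def objective(C, gamma):
        return -(C - 3.0) ** 2 - (gamma - 1.0) ** 2

    # Maximize over the box C in [0, 10], gamma in [0, 5] using 100 evaluations.
    optimal_pars, details, _ = optunity.maximize(objective, num_evals=100,
                                                 C=[0, 10], gamma=[0, 5])
    print(optimal_pars, details.optimum)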

DiffSharp 0.7.0

by gbaydin - September 29, 2015, 14:09:01 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3000 views, 622 downloads, 3 subscriptions

About: DiffSharp is an automatic differentiation (AD) library providing gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products. It allows exact and efficient calculation of derivatives, with support for nesting.


Version 0.7.0 is a reimplementation of the library with support for linear algebra primitives, BLAS/LAPACK, 32- and 64-bit precision, and different CPU/GPU backends.

Changed: Namespaces have been reorganized and simplified. This is a breaking change. There is now just one AD implementation, under DiffSharp.AD (with DiffSharp.AD.Float32 and DiffSharp.AD.Float64 variants, see below). This internally makes use of forward or reverse AD as needed.

Added: Support for 32 bit (single precision) and 64 bit (double precision) floating point operations. All modules have Float32 and Float64 versions providing the same functionality with the specified precision. 32 bit floating point operations are significantly faster (as much as twice as fast) on many current systems.

Added: DiffSharp now uses the OpenBLAS library by default for linear algebra operations. The AD operations with the types D for scalars, DV for vectors, and DM for matrices use the underlying linear algebra backend for highly optimized native BLAS and LAPACK operations. For non-BLAS operations (such as Hadamard products and matrix transpose), parallel implementations in managed code are used. All operations with the D, DV, and DM types support forward and reverse nested AD up to any level. This also paves the way for GPU backends (CUDA/CuBLAS) which will be introduced in following releases. Please see the documentation and API reference for information about how to use the D, DV, and DM types. (Deprecated: The FsAlg generic linear algebra library and the Vector<'T> and Matrix<'T> types are no longer used.)

Fixed: Reverse mode AD has been reimplemented in a tail-recursive way for better performance and to prevent the StackOverflow exceptions encountered in previous versions.

Changed: The library now uses F# 4.0 (FSharp.Core).

Changed: The library is now 64 bit only, meaning that users should set "x64" as the platform target for all build configurations.

Fixed: Various other bugs.
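
DiffSharp itself is an F# library; as a language-neutral illustration of the forward-mode AD idea that such libraries build on, here is a minimal dual-number sketch in Python (the Dual class and derivative helper are hypothetical and are not DiffSharp API):

    # Minimal forward-mode AD with dual numbers (illustration only; not DiffSharp API).
    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv
        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)
        __radd__ = __add__
        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)
        __rmul__ = __mul__

    def derivative(f, x):
        return f(Dual(x, 1.0)).deriv   # seed the tangent with 1

    print(derivative(lambda x: 3 * x * x + 2 * x, 5.0))   # d/dx (3x^2 + 2x) at 5 -> 32.0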

Darwin 1.9

by sgould - September 8, 2015, 06:50:37 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 39930 views, 8295 downloads, 4 subscriptions

About: A platform-independent C++ framework for machine learning, graphical models, and computer vision research and development.


Version 1.9:

  • Replaced drwnInPaint class with drwnImageInPainter class and added inPaint application
  • Added function to read CIFAR-10 and CIFAR-100 style datasets
  • Added drwnMaskedPatchMatch, drwnBasicPatchMatch, drwnSelfPatchMatch and basicPatchMatch application
  • drwnPatchMatchGraph now allows multiple matches to the same image
  • Upgraded wxWidgets to 3.0.2 (fixes problems on Mac OS X)
  • Switched Mac OS X compilation to libc++ instead of libstdc++
  • Added Python scripts for running experiments and regression tests
  • Refactored drwnGrabCutInstance class to support both GMM and colour histogram models
  • Added cacheSortIndex to drwnDecisionTree for trading off speed versus memory usage
  • Added mexLoadPatchMatchGraph for loading drwnPatchMatchGraph objects into Matlab
  • Improved documentation, other bug fixes and performance improvements

Presage 0.9.1

by Dzmitry_Lahoda - August 18, 2015, 10:13:05 CET [ BibTeX Download ] 406 views, 115 downloads, 3 subscriptions

About: Presage is an intelligent predictive text entry platform.


Initial Announcement on mloss.org.

FsAlg 0.5.4

by gbaydin - April 25, 2015, 02:11:03 CET [ Project Homepage BibTeX Download ] 808 views, 246 downloads, 1 subscription

About: FsAlg is a linear algebra library that supports generic types.


Initial Announcement on mloss.org.

Blocks 0.1

by bartvm - March 30, 2015, 22:25:02 CET [ Project Homepage BibTeX Download ] 1000 views, 298 downloads, 3 subscriptions

About: A Theano framework for building and training neural networks.


Initial Announcement on mloss.org.
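
A minimal sketch of the brick-based style Blocks uses to assemble Theano expressions (the layer name and dimensions below are arbitrary examples):

    import theano.tensor as T
    from blocks.bricks import Linear, Rectifier
    from blocks.initialization import IsotropicGaussian, Constant

    # A single hidden layer built from bricks.
    x = T.matrix('features')
    hidden = Linear(name='hidden', input_dim=784, output_dim=256,
                    weights_init=IsotropicGaussian(0.01), biases_init=Constant(0))
    hidden.initialize()                      # allocate and initialize the parameters
    h = Rectifier().apply(hidden.apply(x))   # an ordinary Theano expression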

Theano 0.7

by jaberg - March 27, 2015, 16:40:18 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 18768 views, 3479 downloads, 3 subscriptions

About: A Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. Dynamically generates CPU and GPU modules for good performance. Deep Learning Tutorials illustrate deep learning with Theano.


Theano 0.7 (26th of March, 2015)

We recommend that everyone upgrade to this version.


* Integration of CuDNN for 2D convolutions and pooling on supported GPUs
* Too many optimizations and new features to count
* Various fixes and improvements to scan
* Better support for GPU on Windows
* On Mac OS X, clang is used by default
* Many crash fixes
* Some bug fixes as well
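
A minimal sketch of the define-compile-evaluate workflow described above:

    import theano
    import theano.tensor as T

    # Define a symbolic expression, compile it, then evaluate it on concrete data.
    x = T.dmatrix('x')
    y = T.nnet.sigmoid(T.dot(x, x.T)).sum()
    g = T.grad(y, x)                       # symbolic gradient w.r.t. x
    f = theano.function([x], [y, g])       # Theano generates optimized CPU/GPU code here

    value, grad = f([[0.0, 1.0], [2.0, 3.0]])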

fertilized forests 1.0beta

by Chrisl_S - January 23, 2015, 16:04:31 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1237 views, 301 downloads, 1 subscription

About: The fertilized forests project aims to provide an easy-to-use, easy-to-extend, yet fast library for decision forests. It summarizes the research in this field and provides a solid platform for extending it. Consistent interfaces to C++, Python and Matlab, together with availability for all major compilers, give the user high flexibility in using the library.


Initial Announcement on mloss.org.

Rabit 0.1.0

by crowwork - January 21, 2015, 18:48:46 CET [ Project Homepage BibTeX Download ] 879 views, 306 downloads, 1 subscription

About: Rabit (Reliable Allreduce and Broadcast Interface) is a lightweight library that provides a fault-tolerant interface for Allreduce and Broadcast, for portable, scalable and reliable distributed machine learning programs. Rabit programs can run on various platforms such as Hadoop and MPI, and no installation is needed. Rabit now supports k-means clustering and distributed XGBoost, an extremely efficient distributed gradient boosted tree (GBDT) toolkit.


Initial Announcement on mloss.org.
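
Rabit's own bindings are not shown here; as a plain illustration of the Allreduce/Broadcast programming model it provides, the same pattern expressed with mpi4py looks like this:

    # Allreduce/Broadcast pattern via mpi4py (not Rabit's API);
    # run under an MPI launcher such as mpirun.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    local_stat = float(rank + 1)                    # each worker's partial statistic
    total = comm.allreduce(local_stat, op=MPI.SUM)  # every worker gets the global sum
    model = comm.bcast({'centers': [0.0, 1.0]} if rank == 0 else None, root=0)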

bayes scala 0.5-SNAPSHOT

by danielkorzekwa - January 9, 2015, 19:23:48 CET [ Project Homepage BibTeX Download ] 1135 views, 308 downloads, 2 subscriptions

About: A Scala library for building Bayesian networks with discrete/continuous variables and running deterministic Bayesian inference.


Initial Announcement on mloss.org.

gaml 1.10

by frezza - January 8, 2015, 14:06:58 CET [ Project Homepage BibTeX Download ] 1038 views, 284 downloads, 2 subscriptions

About: C++ generic programming tools for machine learning.


Initial Announcement on mloss.org.

Accord.NET Framework 2.14.0

by cesarsouza - December 9, 2014, 23:04:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 22154 views, 4559 downloads, 2 subscriptions

About: The Accord.NET Framework is a .NET machine learning framework combined with audio and image processing libraries, completely written in C#. It is a complete framework for building production-grade computer vision, computer audition, signal processing and statistics applications, even for commercial use. A comprehensive set of sample applications provides a fast start to get up and running quickly, and extensive online documentation helps fill in the details.


Added a large number of new distributions, such as the Anderson-Darling, Shapiro-Wilk, Inverse Chi-Square, Lévy, Folded Normal, Shifted Log-Logistic, Kumaraswamy, Trapezoidal, U-quadratic, BetaPrime, Birnbaum-Saunders, Generalized Normal, Gumbel, Power Lognormal, Power Normal, Triangular, Tukey Lambda, Logistic, Hyperbolic Secant, Degenerate and General Continuous distributions.

Other additions include new statistical hypothesis tests such as Anderson-Darling and Shapiro-Wilk, support for all of LIBLINEAR's support vector machine algorithms, and format-reading support for MATLAB/Octave matrices, LibSVM models, sparse LibSVM data files, and many others.

For a complete list of changes, please see the full release notes on the release details page.

Lua MapReduce v0.3.6

by pakozm - November 15, 2014, 13:20:01 CET [ Project Homepage BibTeX Download ] 3827 views, 916 downloads, 3 subscriptions

About: Lua-MapReduce is a MapReduce framework implemented in Lua, using the luamongo driver and MongoDB as storage. It follows the Iterative MapReduce pattern for training machine learning statistical models.

  • Improved tuple implementation.

BayesOpt, a Bayesian Optimization toolbox 0.7.2

by rmcantin - October 10, 2014, 19:12:59 CET [ Project Homepage BibTeX Download ] 15445 views, 3043 downloads, 4 subscriptions

About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). There are also interfaces for C, Matlab/Octave and Python.


- Fixed bugs and documentation typos.
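
BayesOpt exposes its own C, C++, Matlab/Octave and Python interfaces; the following is a generic sketch of the methodology it implements (a GP surrogate plus an expected-improvement acquisition), independent of BayesOpt's API and assuming scikit-learn and SciPy are available:

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor

    def bayes_opt(f, lb, ub, n_init=5, n_iter=20, seed=0):
        # Minimize a 1-D black-box f on [lb, ub] with a GP surrogate and
        # expected-improvement acquisition (illustration only).
        rng = np.random.RandomState(seed)
        X = rng.uniform(lb, ub, size=(n_init, 1))
        y = np.array([f(x[0]) for x in X])
        gp = GaussianProcessRegressor(normalize_y=True)
        for _ in range(n_iter):
            gp.fit(X, y)
            cand = np.linspace(lb, ub, 1000).reshape(-1, 1)
            mu, sigma = gp.predict(cand, return_std=True)
            best = y.min()
            z = (best - mu) / np.maximum(sigma, 1e-12)
            ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
            x_next = cand[np.argmax(ei)]
            X = np.vstack([X, x_next])
            y = np.append(y, f(x_next[0]))
        return X[np.argmin(y)], y.min()

    x_best, f_best = bayes_opt(lambda x: (x - 2.0) ** 2 + np.sin(5 * x), 0.0, 4.0)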

Caffe 0.9999

by sergeyk - August 9, 2014, 01:57:58 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 7552 views, 1258 downloads, 2 subscriptions

About: Caffe aims to provide computer vision scientists with a clean, modifiable implementation of state-of-the-art deep learning algorithms. We believe that Caffe is the fastest available GPU CNN implementation. Caffe also provides seamless switching between CPU and GPU, which allows one to train models with fast GPUs and then deploy them on non-GPU clusters. Even in CPU mode, computing predictions on an image takes only 20 ms (in batch mode).


LOTS of stuff.
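
A minimal sketch of the CPU/GPU switching and feed-forward use mentioned above, written against the later pycaffe interface (the 0.9999 Python API may differ, and the file names are placeholders):

    import numpy as np
    import caffe

    caffe.set_mode_gpu()     # or caffe.set_mode_cpu() on a non-GPU cluster
    # 'deploy.prototxt' and 'model.caffemodel' are placeholder file names.
    net = caffe.Net('deploy.prototxt', 'model.caffemodel', caffe.TEST)

    net.blobs['data'].data[...] = np.random.rand(*net.blobs['data'].data.shape)
    out = net.forward()      # dict mapping output blob names to arrays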

ARTOS Adaptive Realtime Object Detection System 1.0

by erik - July 11, 2014, 22:02:34 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1977 views, 423 downloads, 2 subscriptions

About: ARTOS can be used to quickly learn models for visual object detection without having to collect a set of samples manually. To make this possible, it uses ImageNet, a large image database with more than 20,000 categories.


Initial Announcement on mloss.org.

PyStruct 0.2

by t3kcit - July 9, 2014, 09:29:23 CET [ Project Homepage BibTeX Download ] 2947 views, 809 downloads, 1 subscription

About: PyStruct is a framework for learning structured prediction in Python. It has a modular interface, similar to the well-known SVMstruct. Apart from learning algorithms it also contains model formulations for popular CRFs and interfaces to many inference algorithm implementations.


Initial Announcement on mloss.org.
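
A minimal sketch of the modular interface on toy chain-structured data (the data below is synthetic; the ChainCRF/OneSlackSSVM pairing is one of several model/learner combinations PyStruct offers):

    import numpy as np
    from pystruct.models import ChainCRF
    from pystruct.learners import OneSlackSSVM

    # Toy chain-structured data: 20 sequences of 5 nodes, 3 features per node,
    # one binary label per node.
    rng = np.random.RandomState(0)
    X = [rng.randn(5, 3) for _ in range(20)]
    Y = [(x[:, 0] > 0).astype(np.int64) for x in X]

    crf = ChainCRF()                                # model: linear-chain CRF
    ssvm = OneSlackSSVM(model=crf, C=0.1, max_iter=100)
    ssvm.fit(X, Y)                                  # structured max-margin training
    print(ssvm.predict(X[:2]))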

Semi Stochastic Gradient Descent 1.0

by konkey - July 9, 2014, 04:28:47 CET [ BibTeX BibTeX for corresponding Paper Download ] 1935 views, 533 downloads, 1 subscription

About: An efficient implementation of the Semi-Stochastic Gradient Descent algorithm (S2GD) for training L2-regularized logistic regression.


Initial Announcement on mloss.org.
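
The released package is not reproduced here; the following NumPy sketch illustrates the S2GD-style variance-reduced update for L2-regularized logistic regression, simplified to a fixed inner-loop length rather than the randomized stopping time of the original algorithm:

    import numpy as np

    def s2gd_logistic(X, y, lam=1e-3, h=0.1, epochs=10, m=None, seed=0):
        # X: (n, d) design matrix, y: labels in {-1, +1}.
        rng = np.random.RandomState(seed)
        n, d = X.shape
        m = m or n
        w = np.zeros(d)

        def grad_i(v, i):
            # gradient of the i-th L2-regularized logistic loss at v
            margin = y[i] * X[i].dot(v)
            return -y[i] * X[i] / (1.0 + np.exp(margin)) + lam * v

        for _ in range(epochs):
            # outer step: one full gradient at the snapshot w
            mu = np.mean([grad_i(w, i) for i in range(n)], axis=0)
            v = w.copy()
            for _ in range(m):
                i = rng.randint(n)
                # semi-stochastic gradient: full gradient plus a cheap stochastic correction
                v -= h * (mu + grad_i(v, i) - grad_i(w, i))
            w = v
        return w

    # toy usage
    rng = np.random.RandomState(1)
    X = rng.randn(200, 5)
    y = np.sign(X[:, 0] + 0.1 * rng.randn(200))
    print(s2gd_logistic(X, y))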

IPCA v0.1

by kiraly - July 7, 2014, 10:25:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1833 views, 393 downloads, 1 subscription

About: This package implements Ideal PCA in MATLAB. Ideal PCA is a (cross-)kernel based feature extraction algorithm which is (a) a faster alternative to kernel PCA and (b) a method for learning features that certify the data manifold.


Initial Announcement on mloss.org.

Java deep neural networks with GPU 0.2.0-alpha

by hok - May 10, 2014, 14:22:30 CET [ Project Homepage BibTeX Download ] 2489 views, 587 downloads, 2 subscriptions

About: GPU-accelerated Java deep neural networks.


Initial Announcement on mloss.org.
