Projects running under Linux.
Showing items 41-60 of 239 (page 3 of 12).

FEAST 1.1.1

by apocock - June 30, 2014, 01:30:23 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 19395 views, 4301 downloads, 1 subscription

Rating: 5/5 stars (based on 1 vote)

About: FEAST provides implementations of common mutual-information-based filter feature selection algorithms (mim, mifs, mrmr, cmim, icap, jmi, disr, fcbf, etc.), and an implementation of RELIEF. Written for C/C++ & Matlab.
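
As a library-agnostic illustration of the simplest criterion FEAST implements, the Python sketch below ranks discrete features by their mutual information with the class label (the MIM criterion). It uses scikit-learn's mutual_info_score as a stand-in estimator; it is not FEAST's own C or Matlab API.

    # Sketch of the MIM (Mutual Information Maximisation) filter criterion that
    # FEAST implements; this is NOT FEAST's API, just the underlying idea.
    import numpy as np
    from sklearn.metrics import mutual_info_score  # assumed helper, not part of FEAST

    def mim_select(X, y, k):
        """Return indices of the k discrete features with the highest I(X_i; y)."""
        scores = [mutual_info_score(X[:, i], y) for i in range(X.shape[1])]
        return np.argsort(scores)[::-1][:k]

    X = np.random.randint(0, 3, size=(100, 10))   # toy discrete data
    y = np.random.randint(0, 2, size=100)
    print(mim_select(X, y, 3))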

Changes:
  • Bug fixes to memory management.
  • Compatibility changes for the PyFeast Python wrapper (note: the C library now returns feature indices starting from 0, while the Matlab wrapper still returns indices starting from 1).
  • Added C version of MIM.
  • Updated internal version of MIToolbox.

MIToolbox 2.1

by apocock - June 30, 2014, 01:05:57 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 15502 views, 2873 downloads, 1 subscription

About: A mutual information library for C and Mex bindings for MATLAB. Aimed at feature selection, and provides simple methods to calculate mutual information, conditional mutual information, entropy, conditional entropy, Renyi entropy/mutual information, and weighted variants of Shannon entropies/mutual informations. Works with discrete distributions, and expects column vectors of features.
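
For reference, the core quantity MIToolbox computes is plain discrete (Shannon) mutual information over column vectors. The Python sketch below shows that computation from empirical frequencies; it is a conceptual illustration, not the library's C/Mex interface.

    # Discrete mutual information from empirical frequencies, in bits.
    # Conceptual only; MIToolbox implements this (plus conditional and
    # weighted variants) in C with Mex bindings.
    import numpy as np

    def mutual_information(x, y):
        x, y = np.asarray(x), np.asarray(y)
        mi = 0.0
        for xv in np.unique(x):
            for yv in np.unique(y):
                p_xy = np.mean((x == xv) & (y == yv))
                if p_xy > 0:
                    mi += p_xy * np.log2(p_xy / (np.mean(x == xv) * np.mean(y == yv)))
        return mi

    print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))  # 1.0 bit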

Changes:

Added weighted entropy functions. Fixed a few memory handling bugs.


OpenOpt 0.54

by Dmitrey - June 15, 2014, 14:50:37 CET [ Project Homepage BibTeX Download ] 45732 views, 9576 downloads, 3 subscriptions

Rating: 4/5 stars (based on 2 votes)

About: Universal numerical optimization toolbox written in Python. Problem classes: NLP, LP, QP, NSP, MILP, LSP, LLSP, MMP, GLP, SLE, MOP, etc.; supports general logical constraints, categorical variables, automatic differentiation, stochastic programming, interval analysis, and many other features.
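
Typical usage is a few lines: construct a problem object and call solve with a solver name. The sketch below is recalled from the 0.x documentation and should be checked against the project homepage; the NLP class, the 'ralg' solver, and the result fields xf/ff are assumptions of this sketch, not verified against release 0.54.

    # Rough sketch of OpenOpt usage for an unconstrained NLP; names recalled
    # from the docs (NLP, 'ralg', r.xf, r.ff) -- verify against openopt.org.
    import numpy as np
    from openopt import NLP

    p = NLP(lambda x: ((x - 1) ** 2).sum(), x0=np.zeros(3))
    r = p.solve('ralg')            # OpenOpt's built-in NLP solver
    print(r.xf, r.ff)              # optimal point and objective value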

Changes:

http://openopt.org/Changelog


A Pattern Recognizer In Lua with ANNs v0.3.1

by pakozm - May 30, 2014, 10:49:10 CET [ Project Homepage BibTeX Download ] 3442 views, 810 downloads, 2 subscriptions

About: APRIL-ANN toolkit (A Pattern Recognizer In Lua with Artificial Neural Networks). This toolkit incorporates ANN algorithms (such as dropout, stacked denoising auto-encoders, and convolutional neural networks), together with other pattern recognition methods such as hidden Markov models (HMMs), among others.

Changes:
  • Fixed several bugs.
  • Added Travis CI support.
  • KNN and clustering algorithms.
  • ZCA and PCA whitening.
  • Quickprop and ASGD optimization algorithms.
  • QLearning trainer.
  • Sparse float matrices are available in CSC and CSR formats.
  • Compilation with Homebrew and MacPorts available.
  • Compilation issues in Ubuntu 12.04 solved.

Weight HMM 1.0

by SongTao - May 27, 2014, 15:29:20 CET [ BibTeX Download ] 615 views, 224 downloads, 1 subscription

About: Discovering short linear protein motifs based on selective training of profile hidden Markov models.

Changes:

Initial Announcement on mloss.org.


Mr. 1.0

by SongTao - May 27, 2014, 15:20:40 CET [ BibTeX Download ] 525 views, 221 downloads, 1 subscription

About: Discovering short linear protein motifs based on selective training of profile hidden Markov models.

Changes:

Initial Announcement on mloss.org.


Java deep neural networks with GPU 0.2.0-alpha

by hok - May 10, 2014, 14:22:30 CET [ Project Homepage BibTeX Download ] 1730 views, 393 downloads, 2 subscriptions

About: GPU-accelerated deep neural networks in Java.

Changes:

Initial Announcement on mloss.org.


PredictionIO 0.7.0

by simonc - April 29, 2014, 20:59:57 CET [ Project Homepage BibTeX Download ] 7017 views, 1429 downloads, 2 subscriptions

About: Open Source Machine Learning Server

Changes:
  • Single machine version for small-to-medium scale deployments
  • Integrated GraphChi (disk-based large-scale graph computation) and algorithms: ALS, CCD++, SGD, CLiMF
  • Improved runtime for training and offline evaluation
  • Bug fixes

See release notes - https://predictionio.atlassian.net/secure/ReleaseNote.jspa?projectId=10000&version=11801


RFD 1.0

by openpr_nlpr - April 28, 2014, 10:34:57 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1280 views, 313 downloads, 1 subscription

About: This is an unoptimized implementation of the RFD binary descriptor, which is published in the following paper: B. Fan et al., "Receptive Fields Selection for Binary Feature Description", IEEE Transactions on Image Processing, 2014. doi: http://dx.doi.org/10.1109/TIP.2014.2317981

Changes:

Initial Announcement on mloss.org.


RLLib 2.0

About: RLLib is a lightweight C++ template library that implements incremental, standard, and gradient temporal-difference learning algorithms in reinforcement learning. It is optimized for robotic applications and embedded devices operating under fast duty cycles (e.g., < 30 ms). RLLib has been tested and evaluated on RoboCup 3D soccer simulation agents, physical NAO V4 humanoid robots, and Tiva C series LaunchPad microcontrollers to predict, control, learn behaviors, and represent learnable knowledge. The implementation of RLLib is inspired by the RLPark API, a library of temporal-difference learning algorithms written in Java.
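
RLLib itself is a C++ template library; as a language-agnostic illustration of the incremental temporal-difference learning it provides, the Python sketch below runs tabular TD(0) value prediction on a toy 5-state random walk. The environment is invented for illustration and is not part of RLLib.

    # Tabular TD(0) prediction: the incremental update at the core of
    # temporal-difference learning (RLLib implements this and gradient-TD
    # variants in C++). Toy 5-state random walk, not part of RLLib.
    import random

    n_states, alpha, gamma = 5, 0.1, 1.0
    V = [0.0] * (n_states + 2)           # states 0 and 6 are terminal

    for episode in range(2000):
        s = (n_states + 1) // 2          # start in the middle state
        while s not in (0, n_states + 1):
            s_next = s + random.choice((-1, 1))
            r = 1.0 if s_next == n_states + 1 else 0.0
            # TD(0): V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s))
            V[s] += alpha * (r + gamma * V[s_next] - V[s])
            s = s_next

    print([round(v, 2) for v in V[1:-1]])   # approaches [0.17, 0.33, 0.5, 0.67, 0.83]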

Changes:

Current release version is v2.0.


libstb 1.8

by wbuntine - April 24, 2014, 09:02:17 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5910 views, 1161 downloads, 1 subscription

About: Generalised Stirling Numbers for Pitman-Yor Processes: this library provides ways of computing generalised 2nd-order Stirling numbers for Pitman-Yor and Dirichlet processes. Included are a tester and a parameter optimiser. This accompanies Buntine and Hutter's article: http://arxiv.org/abs/1007.0296, and a series of papers by Buntine and students at NICTA and ANU.
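
The numbers in question satisfy a simple two-term recurrence, S(n+1, m) = S(n, m-1) + (n - m*a) * S(n, m) with S(0, 0) = 1, where a is the Pitman-Yor discount (please check Buntine and Hutter's article for the exact convention). The Python sketch below fills a small table directly from this recurrence; libstb itself works in log space with caching, which this illustration does not attempt.

    # Generalised 2nd-order Stirling numbers for a Pitman-Yor discount a,
    # via the recurrence S(n+1, m) = S(n, m-1) + (n - m*a) * S(n, m).
    # Plain floats for illustration only; they overflow quickly for large n,
    # which is why libstb computes these quantities in log space.
    def stirling_table(n_max, a):
        S = [[0.0] * (n_max + 1) for _ in range(n_max + 1)]
        S[0][0] = 1.0
        for n in range(n_max):
            for m in range(n + 1):
                if S[n][m]:
                    S[n + 1][m + 1] += S[n][m]               # open a new table
                    S[n + 1][m] += (n - m * a) * S[n][m]     # join an existing table
        return S

    S = stirling_table(6, 0.5)
    print(S[5][2])    # S(5, 2) at discount a = 0.5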

Changes:

Moved repository to GitHub, and added thread support to use the main table lookups in multi-threaded code.


GradMC 2.00

by tur - April 14, 2014, 15:48:48 CET [ BibTeX Download ] 2007 views, 671 downloads, 1 subscription

About: GradMC is an algorithm for MR motion artifact removal implemented in Matlab

Changes:

Added support for multi-rigid motion correction.


MShadow 1.0

by antinucleon - April 10, 2014, 02:57:54 CET [ Project Homepage BibTeX Download ] 1061 views, 303 downloads, 1 subscription

About: Lightweight CPU/GPU matrix/tensor template library in C++/CUDA. Supports high-performance element-wise expression expansion. Write code once and run it on both GPU and CPU.

Changes:

Initial Announcement on mloss.org.


CXXNET 0.1

by antinucleon - April 10, 2014, 02:47:08 CET [ Project Homepage BibTeX Download ] 1298 views, 332 downloads, 1 subscription

About: CXXNET (spelled "C plus plus net") is a neural network toolkit built on mshadow (https://github.com/tqchen/mshadow). It is yet another implementation of (convolutional) neural networks. It is written in C++, with about 1000 lines of network layer implementations, is easily configured via a config file, and can reach state-of-the-art performance.

Changes:

Initial Announcement on mloss.org.


JMLR Tapkee 1.0

by blackburn - April 10, 2014, 02:45:58 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 7246 views, 2109 downloads, 1 subscription

About: Tapkee is an efficient and flexible C++ template library for dimensionality reduction.

Changes:

Initial Announcement on mloss.org.


JMLR MOA Massive Online Analysis Nov-13

by abifet - April 4, 2014, 03:50:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12531 views, 5002 downloads, 1 subscription

About: Massive Online Analysis (MOA) is a real-time analytics tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license.
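
MOA is driven from its Java GUI and command line; as a conceptual sketch of the prequential ("test-then-train") evaluation loop that stream tools like MOA run over evolving data, here is a short Python illustration. The MajorityClass learner and the toy stream are placeholders for this sketch, not MOA classes.

    # Conceptual prequential (test-then-train) evaluation over a data stream.
    # Placeholders only -- MOA's actual learners and evaluators are Java classes.
    from collections import Counter

    class MajorityClass:
        """Toy incremental learner: predicts the most frequent label seen so far."""
        def __init__(self):
            self.counts = Counter()
        def predict(self, x):
            return self.counts.most_common(1)[0][0]
        def learn(self, x, y):
            self.counts[y] += 1

    def prequential_accuracy(stream, learner):
        correct = tested = 0
        seen_any = False
        for x, y in stream:
            if seen_any:                                   # test first ...
                correct += int(learner.predict(x) == y)
                tested += 1
            learner.learn(x, y)                            # ... then train
            seen_any = True
        return correct / tested if tested else 0.0

    stream = [([i], i % 3 == 0) for i in range(300)]       # toy stream
    print(prequential_accuracy(stream, MajorityClass()))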

Changes:

New version November 2013


JMLR MultiBoost 1.2.02

by busarobi - March 31, 2014, 16:13:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 26138 views, 4543 downloads, 1 subscription

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.
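
For orientation, the boosting loop behind the engine can be sketched in a few lines of Python; the sketch below uses the simpler binary discrete AdaBoost with threshold stumps rather than the multi-class AdaBoost.MH that MultiBoost actually implements, and is an illustration of the idea, not MultiBoost's C++ code.

    # Binary discrete AdaBoost with threshold stumps -- a simplified stand-in
    # for the multi-class AdaBoost.MH loop that MultiBoost implements in C++.
    import numpy as np

    def adaboost(X, y, n_rounds=20):
        """y must be in {-1, +1}; returns a list of (feature, threshold, sign, alpha)."""
        n, d = X.shape
        w = np.full(n, 1.0 / n)
        model = []
        for _ in range(n_rounds):
            best = None
            for j in range(d):                       # exhaustive stump search
                for thr in np.unique(X[:, j]):
                    for sign in (1, -1):
                        pred = sign * np.where(X[:, j] <= thr, 1, -1)
                        err = w[pred != y].sum()
                        if best is None or err < best[0]:
                            best = (err, j, thr, sign, pred)
            err, j, thr, sign, pred = best
            err = np.clip(err, 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)    # base learner weight
            w *= np.exp(-alpha * y * pred)           # up-weight misclassified examples
            w /= w.sum()
            model.append((j, thr, sign, alpha))
        return model

    def predict(model, X):
        return np.sign(sum(a * s * np.where(X[:, j] <= t, 1, -1) for j, t, s, a in model))

    X = np.random.rand(200, 3)                       # toy data
    y = np.where(X[:, 0] > 0.5, 1, -1)
    print((predict(adaboost(X, y), X) == y).mean())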

Changes:

Major changes :

  • The “early stopping” feature can now be based on any metric output with the --outputinfo command line argument.

  • Early stopping now works with --slowresume command line argument.

Minor fixes:

  • More informative output when testing.

  • Fixed various compilation glitches with recent clang (OS X/Linux).


JMLR EnsembleSVM 2.0

by claesenm - March 31, 2014, 08:06:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6482 views, 2339 downloads, 2 subscriptions

About: The EnsembleSVM library offers functionality to perform ensemble learning using Support Vector Machine (SVM) base models. In particular, we offer routines for binary ensemble models using SVM base classifiers. Experimental results have shown the predictive performance to be comparable with standard SVM models but with drastically reduced training time. Ensemble learning with SVM models is particularly useful for semi-supervised tasks.
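
As a conceptual illustration of the idea (many SVM base models trained on subsamples and combined by majority voting), the Python sketch below uses scikit-learn rather than EnsembleSVM's C++ tools; it mirrors the approach, not the library's API.

    # Ensemble of SVM base classifiers on random subsamples, aggregated by
    # majority voting -- the idea behind EnsembleSVM, shown with scikit-learn.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # 25 RBF-SVMs, each trained on 20% of the data; predictions are majority-voted.
    ensemble = BaggingClassifier(SVC(kernel="rbf"),
                                 n_estimators=25, max_samples=0.2, random_state=0)
    ensemble.fit(X_tr, y_tr)
    print(ensemble.score(X_te, y_te))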

Changes:

The library has been updated and features a variety of new functionality as well as more efficient implementations of original features. The following key improvements have been made:

  1. Support for multithreading in training and prediction with ensemble models. Since both of these are embarrassingly parallel, this yields a significant speedup (3-fold on a quad-core machine).
  2. Extensive programming framework for aggregation of base model predictions which allows highly efficient prototyping of new aggregation approaches. Additionally we provide several predefined strategies, including (weighted) majority voting, logistic regression and nonlinear SVMs of your choice -- be sure to check out the esvm-edit tool! The provided framework also allows you to efficiently program your own, novel aggregation schemes.
  3. Full code transition to C++11, the latest C++ standard, which enabled various performance improvements. The new release requires moderately recent compilers, such as gcc 4.7.2+ or clang 3.2+.
  4. Generic implementations of convenient facilities have been added, such as thread pools, deserialization factories and more.

The API and ABI have undergone significant changes, many of which are due to the transition to C++11.


Libra 1.0.1

by lowd - March 30, 2014, 09:42:00 CET [ Project Homepage BibTeX Download ] 10898 views, 2404 downloads, 1 subscription

About: The Libra Toolkit is a collection of algorithms for learning and inference with discrete probabilistic models, including Bayesian networks, Markov networks, dependency networks, sum-product networks, arithmetic circuits, and mixtures of trees.

Changes:

Version 1.0.1 (3/30/2014):

  • Several new algorithms -- acmn, learning ACs using MNs; idspn, SPN structure learning; mtlearn, learning mixtures of trees
  • Several new support programs -- spquery, for exact inference in SPNs; spn2ac, for converting SPNs to ACs
  • Renamed aclearnstruct to acbn
  • Replaced aclearnstruct -noac with separate bnlearn program
  • ...and many more small changes and fixes, throughout!

JMLR fastclime 1.2.3

by colin1898 - March 10, 2014, 08:54:41 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2180 views, 587 downloads, 1 subscription

About: The package "fastclime" provides a method to recover the precision matrix efficiently by applying the parametric simplex method. The computation is based on a linear optimization solver. It also contains a generic LP solver and a parameterized LP solver using the parametric simplex method.
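
For context, the precision-matrix programme this targets is the CLIME estimator, which the parametric simplex method can solve column by column as a linear programme. Assuming the standard CLIME formulation (consult the corresponding paper for the exact convention used by fastclime):

    \hat{\Omega} = \arg\min_{\Omega} \|\Omega\|_1
    \quad \text{subject to} \quad \|S_n \Omega - I\|_{\infty} \le \lambda

where S_n is the sample covariance matrix and \lambda is a tuning parameter.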

Changes:

Initial Announcement on mloss.org.

