Projects running under Mac OS X.
Showing items 1-20 of 74 (page 1 of 4).

dlib ml 18.10 (JMLR)

by davis685 - August 29, 2014, 02:56:23 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 78818 views, 13672 downloads, 2 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems.

Changes:

In addition to a number of usability improvements, this release adds an implementation of the recent paper "One Millisecond Face Alignment with an Ensemble of Regression Trees" by Vahid Kazemi and Josephine Sullivan. This includes tools for performing high quality face landmarking as well as tools for training new landmarking models. See the face_landmark_detection_ex.cpp and train_shape_predictor_ex.cpp example programs for an introduction.
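
For readers who want to try the new landmarking tools from Python rather than the C++ examples, here is a minimal sketch using dlib's Python bindings. It assumes a dlib build with the Python API enabled and a trained model file such as shape_predictor_68_face_landmarks.dat; load_rgb_image is only available in newer releases, so older builds may need another image loader.

    import dlib

    detector = dlib.get_frontal_face_detector()   # built-in HOG face detector
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    # load_rgb_image exists in recent releases; any uint8 RGB numpy array works
    img = dlib.load_rgb_image("face.jpg")
    for rect in detector(img, 1):                  # upsample once to catch smaller faces
        shape = predictor(img, rect)               # ensemble-of-regression-trees fit
        points = [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]
        print(len(points), "landmarks found in", rect)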


hca 0.6

by wbuntine - August 6, 2014, 14:24:57 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3651 views, 682 downloads, 3 subscriptions

About: Multi-core non-parametric and bursty topic models (HDP-LDA, DCMLDA, and other variants of LDA) implemented in C using efficient Gibbs sampling, with hyperparameter sampling and other flexible controls.

Changes:

Modified the command-line -A and -B formats and overhauled the diagnostics; the changes are described in the manual. Bug fixes: a multi-core crash when using a huge number of topics, and broken beta sampling when -B was given a number while fitting beta; both are now fixed.


Waffles 2014-07-05 (JMLR)

by mgashler - July 20, 2014, 04:53:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 22949 views, 6888 downloads, 2 subscriptions

About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.)

Changes:

Added support for CUDA GPU-parallelized neural network layers, and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html


MSVMpack 1.5 (JMLR)

by lauerfab - July 3, 2014, 16:02:49 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10870 views, 3721 downloads, 2 subscriptions

About: MSVMpack is a Multi-class Support Vector Machine (M-SVM) package. It is dedicated to SVMs which can handle more than two classes without relying on decomposition methods and implements the four M-SVM models from the literature: Weston and Watkins M-SVM, Crammer and Singer M-SVM, Lee, Lin and Wahba M-SVM, and the M-SVM2 of Guermeur and Monfrini.

Changes:
  • Windows binaries are now included (by Emmanuel Didiot)
  • MSVMpack can now be compiled on Windows (by Emmanuel Didiot)
  • Fixed polynomial kernel
  • Minor bug fixes

FEAST 1.1.1

by apocock - June 30, 2014, 01:30:23 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 14466 views, 3420 downloads, 1 subscription

Rating: 5/5 (based on 1 vote)

About: FEAST provides implementations of common mutual information based filter feature selection algorithms (mim, mifs, mrmr, cmim, icap, jmi, disr, fcbf, etc), and an implementation of RELIEF. Written for C/C++ & Matlab.

Changes:
  • Bug fixes to memory management.
  • Compatibility changes for the PyFeast python wrapper (note: the C library now returns feature indices starting from 0, while the Matlab wrapper still returns indices starting from 1).
  • Added C version of MIM.
  • Updated internal version of MIToolbox.

MIToolbox 2.1

by apocock - June 30, 2014, 01:05:57 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12447 views, 2378 downloads, 1 subscription

About: A mutual information library for C and Mex bindings for MATLAB. Aimed at feature selection, and provides simple methods to calculate mutual information, conditional mutual information, entropy, conditional entropy, Renyi entropy/mutual information, and weighted variants of Shannon entropies/mutual informations. Works with discrete distributions, and expects column vectors of features.
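
As a conceptual illustration only (plain numpy, not the library's C or Mex API), the two basic quantities named above can be computed for discrete feature vectors roughly as follows:

    import numpy as np

    def entropy(x):
        """Shannon entropy H(X) in bits of a discrete 1-D feature vector."""
        _, counts = np.unique(x, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def mutual_information(x, y):
        """I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete vectors of equal length."""
        pairs = np.stack([x, y], axis=1)
        # encode each (x, y) pair as one symbol so entropy() gives the joint entropy
        _, joint = np.unique(pairs, axis=0, return_inverse=True)
        return entropy(x) + entropy(y) - entropy(joint)

    x = np.array([0, 0, 1, 1, 2, 2])
    y = np.array([0, 0, 1, 1, 1, 1])
    print(entropy(x), mutual_information(x, y))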

Changes:

Added weighted entropy functions. Fixed a few memory handling bugs.


OpenOpt 0.54

by Dmitrey - June 15, 2014, 14:50:37 CET [ Project Homepage BibTeX Download ] 40719 views, 8554 downloads, 3 subscriptions

Rating: 4/5 (based on 2 votes)

About: Universal Python-written numerical optimization toolbox. Problems: NLP, LP, QP, NSP, MILP, LSP, LLSP, MMP, GLP, SLE, MOP etc; general logical constraints, categorical variables, automatic differentiation, stochastic programming, interval analysis, many other goodies
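
A minimal usage sketch, assuming OpenOpt and its built-in ralg solver are installed; the NLP problem class and the result fields xf/ff follow the pattern documented at openopt.org, though available problem classes and solvers vary by installation:

    import numpy as np
    from openopt import NLP

    f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2   # smooth toy objective
    p = NLP(f, np.zeros(2), lb=[-5, -5], ub=[5, 5])       # box-constrained problem
    r = p.solve('ralg')                                    # built-in solver
    print(r.xf, r.ff)                                      # minimizer and optimal value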

Changes:

http://openopt.org/Changelog


WEKA 3.7.11

by mhall - April 24, 2014, 10:13:12 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 38583 views, 5569 downloads, 2 subscriptions

Rating: 4/5 (based on 6 votes)

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...]

Changes:

In core weka:

  • Bagging and RandomForest are now faster if the base learner is a WeightedInstancesHandler
  • Speed-ups for REPTree and other classes that use entropy calculations
  • Many other code improvements and speed-ups
  • Additional statistics available in the output of LinearRegression and SimpleLinearRegression. Contributed by Chris Meyer
  • Reduced memory consumption in BayesNet
  • Improvements to the package manager: load status of individual packages can now be toggled to prevent a package from loading; "Available" button now displays the latest version of all available packages that are compatible with the base version of Weka
  • RandomizableFilteredClassifier
  • Canopy clusterer
  • ImageViewer KnowledgeFlow component
  • PMML export support for Logistic. Infrastructure and changes contributed by David Person
  • Extensive tool-tips now displayed in the Explorer's scheme selector tree lists
  • Join KnowledgeFlow component for performing an inner join on two incoming streams/data sets

In packages:

  • IWSSembeded package, contributed by Pablo Bermejo
  • CVAttributeEval package, contributed by Justin Liang
  • distributedWeka package for Hadoop
  • Improvements to multiLayerPerceptrons and addition of MLPAutoencoder
  • Code clean-up in many packages

Tapkee 1.0 (JMLR)

by blackburn - April 10, 2014, 02:45:58 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5474 views, 1528 downloads, 1 subscription

About: Tapkee is an efficient and flexible C++ template library for dimensionality reduction.

Changes:

Initial Announcement on mloss.org.


MOA Massive Online Analysis Nov-13 (JMLR)

by abifet - April 4, 2014, 03:50:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10896 views, 4321 downloads, 1 subscription

About: Massive Online Analysis (MOA) is a real time analytic tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license.

Changes:

New version November 2013


MultiBoost 1.2.02 (JMLR)

by busarobi - March 31, 2014, 16:13:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 22203 views, 3932 downloads, 1 subscription

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.

Changes:

Major changes :

  • The “early stopping” feature can now be based on any metric output with the --outputinfo command line argument.

  • Early stopping now works with --slowresume command line argument.

Minor fixes:

  • More informative output when testing.

  • Fixed various compilation glitches with recent clang (OS X/Linux).


SHOGUN 3.2.0 (JMLR)

by sonne - February 17, 2014, 20:31:36 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 82925 views, 11468 downloads, 5 subscriptions

Rating: 3/5 (based on 6 votes)

About: The SHOGUN machine learning toolbox focuses on large-scale learning methods, in particular Support Vector Machines (SVM), and provides interfaces to Python, Octave, MATLAB, R and the command line.
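
As a rough sketch of the python interface mentioned above (assuming the modular modshogun bindings are built; class names follow the 3.x modular API and may differ slightly between releases):

    import numpy as np
    from modshogun import RealFeatures, BinaryLabels, GaussianKernel, LibSVM

    X = np.random.randn(2, 100)                 # SHOGUN expects examples as columns
    y = np.sign(X[0, :] + X[1, :])              # +/-1 labels for a toy problem

    feats = RealFeatures(X)
    labels = BinaryLabels(y)
    kernel = GaussianKernel(feats, feats, 1.0)  # Gaussian kernel of width 1.0
    svm = LibSVM(1.0, kernel, labels)           # regularization constant C = 1.0
    svm.train()

    pred = svm.apply(RealFeatures(np.random.randn(2, 10)))
    print(pred.get_labels())                    # predicted +/-1 labels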

Changes:

This is mostly a bugfix release:

Features

  • Fully support python3 now
  • Add mini-batch k-means [Parijat Mazumdar]
  • Add k-means++ [Parijat Mazumdar]
  • Add sub-sequence string kernel [lambday]

Bugfixes

  • Compile fixes for upcoming swig3.0
  • Speedup for Gaussian process apply()
  • Improve unit / integration test checks
  • libbmrm uninitialized memory reads
  • libocas uninitialized memory reads
  • Octave 3.8 compile fixes [Orion Poplawski]
  • Fix java modular compile error [Bjoern Esser]

CTBN-RLE

About: The CTBN-RLE (Continuous Time Bayesian Network Reasoning and Learning Engine) is a C++ package of executables and libraries for inference and learning algorithms for continuous time Bayesian networks (CTBNs).

Changes:

Compilation problems fixed.


Theano 0.6

by jaberg - December 3, 2013, 20:32:02 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12587 views, 2353 downloads, 1 subscription

About: A Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. Dynamically generates CPU and GPU modules for good performance. Deep Learning Tutorials illustrate deep learning with Theano.
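
A short sketch of the define/optimize/evaluate workflow (using only the stable tensor API; nothing here is specific to the 0.6 release):

    import numpy as np
    import theano
    import theano.tensor as T

    x = T.dmatrix('x')                      # symbolic input matrix
    w = theano.shared(np.ones((3, 1)))      # shared (possibly GPU-backed) parameter
    y = T.nnet.sigmoid(T.dot(x, w))         # symbolic expression graph
    g = T.grad(y.sum(), w)                  # symbolic differentiation

    f = theano.function([x], [y, g])        # compiled CPU/GPU function
    out, grad = f(np.random.randn(4, 3))
    print(out.shape, grad.shape)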

Changes:

Theano 0.6 (December 3rd, 2013)

Highlights:

* Last release with support for Python 2.4 and 2.5.
* We will try to release more frequently.
* Fix crash/installation problems.
* Use less memory for conv3d2d.

0.6rc4 skipped for a technical reason.

Highlights (since 0.6rc3):

* Python 3.3 compatibility with buildbot test for it.
* Full advanced indexing support.
* Better Windows 64 bit support.
* New profiler.
* Better error messages that help debugging.
* Better support for newer NumPy versions (removes useless warnings/crashes).
* Faster optimization/compilation for big graphs.
* Moved the Conv3d2d implementation into Theano.
* Better SymPy/Theano bridge: make a Theano op from a SymPy expression and use the SymPy C code generator.
* Bug fixes.

Too many changes in 0.6rc1, 0.6rc2 and 0.6rc3 to list here. See https://github.com/Theano/Theano/blob/master/NEWS.txt for details.


GBAC 0.0.4

by henrydcl - November 22, 2013, 20:04:16 CET [ BibTeX BibTeX for corresponding Paper Download ] 2091 views, 708 downloads, 2 subscriptions

About: Probabilistic performance evaluation for multiclass classification using the posterior balanced accuracy
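
The underlying idea can be sketched in a few lines of numpy (a conceptual illustration of the posterior balanced accuracy, not GBAC's actual interface; the per-class counts are invented for the example): under a flat Beta(1, 1) prior each class's accuracy has a Beta posterior, the balanced accuracy is their average, and its posterior can be sampled directly.

    import numpy as np

    correct = np.array([45, 30, 8])    # hypothetical correct predictions per class
    totals  = np.array([50, 40, 10])   # hypothetical test examples per class

    # posterior samples of each per-class accuracy under a flat Beta(1, 1) prior
    acc = np.random.beta(correct + 1, totals - correct + 1, size=(100000, 3))
    bal_acc = acc.mean(axis=1)         # posterior samples of the balanced accuracy

    print("posterior mean:", bal_acc.mean())
    print("95% credible interval:", np.percentile(bal_acc, [2.5, 97.5]))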

Changes:

Added bibtex information.


CARP 3.3 (JMLR)

by volmeln - November 7, 2013, 15:48:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 13811 views, 4427 downloads, 1 subscription

About: CARP: The Clustering Algorithms’ Referee Package

Changes:

Generalized overlap error and some bugs have been fixed


bob 1.2.2

by anjos - October 28, 2013, 14:37:36 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4800 views, 998 downloads, 1 subscription

About: Bob is a free signal-processing and machine learning toolbox originally developed by the Biometrics group at Idiap Research Institute, in Switzerland.

Changes:

Bob 1.2.0 comes about 1 year after we released Bob 1.0.0. This new release comes with a big set of new features and lots of changes under the hood to make your experiments run even smoother. Some statistics:

Diff URL: https://github.com/idiap/bob/compare/v1.1.4...HEAD; Commits: 629; Files changed: 954; Contributors: 7

Here is a quick list of things you should pay attention to while integrating your satellite packages against Bob 1.2.x:

  • The LBP module had its API changed; look at the online docs for more details
  • LLRTrainer has been renamed to CGLogRegTrainer
  • The order in which you pass data to CGLogRegTrainer has been inverted (negatives now go first)
  • For C++ bindings, includes are in bob/python instead of bob/core/python
  • All specialized Bob exceptions are gone, if you were catching them, most have been cast into std::runtime_error's

For a detailed list of changes and additions, please look at our Changelog page for this release and minor updates:

https://github.com/idiap/bob/wiki/Changelog-from-1.1.4-to-1.2
https://github.com/idiap/bob/wiki/Changelog-from-1.2.0-to-1.2.1
https://github.com/idiap/bob/wiki/Changelog-from-1.2.1-to-1.2.2


scikit-learn 0.14.1 (JMLR)

by fabianp - October 4, 2013, 15:01:45 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12212 views, 4285 downloads, 3 subscriptions

Rating: 4.5/5 (based on 3 votes)

About: The scikit-learn project is a machine learning library in Python.
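
A quick usage sketch of the estimator API (fit/predict/score) as it looked around the 0.14 releases; note that sklearn.cross_validation was later renamed to sklearn.model_selection:

    from sklearn.datasets import load_iris
    from sklearn.cross_validation import train_test_split   # model_selection in newer versions
    from sklearn.ensemble import RandomForestClassifier

    iris = load_iris()
    X_train, X_test, y_train, y_test = train_test_split(
        iris.data, iris.target, test_size=0.3, random_state=0)

    clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))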

Changes:

Update for 0.14.1


Jstacs 2.1 (JMLR)

by keili - June 3, 2013, 07:32:55 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 13696 views, 3228 downloads, 2 subscriptions

About: A Java framework for statistical analysis and classification of biological sequences

Changes:

New classes:

  • MultipleIterationsCondition: Requires another TerminationCondition to fail a specified number of contiguous times
  • ClassifierFactory: Allows for creating standard classifiers
  • SeqLogoPlotter: Plots PNG sequence logos from within Jstacs
  • MultivariateGaussianEmission: Multivariate Gaussian emission density for a Hidden Markov Model
  • MEManager: Maximum entropy model

New features and improvements:

  • Alignment: Added free shift alignment
  • PerformanceMeasure and sub-classes: Extension to weighted test data
  • AbstractClassifier, ClassifierAssessment and sub-classes: Adaptation to weighted PerformanceMeasures
  • DNAAlphabet: Parser speed-up
  • PFMComparator: Extension to PFM from other sources/databases
  • ToolBox: New convenience methods for computing several statistics (e.g., median, correlation)
  • SignificantMotifOccurrencesFinder: New methods for computing PWMs and statistics from predictions
  • SequenceScore and sub-classes: New method toString(NumberFormat)
  • DataSet: Adaptation to weighted data, e.g., partitioning
  • REnvironment: Changed several methods from String to CharSequence

Restructuring:

  • Changed MultiDimensionalSequenceWrapperDiffSM to MultiDimensionalSequenceWrapperDiffSS

Several minor new features, bug fixes, and code cleanups


MLDemos 0.5.1

by basilio - March 2, 2013, 16:06:13 CET [ Project Homepage BibTeX Download ] 17668 views, 4207 downloads, 2 subscriptions

About: MLDemos is a user-friendly visualization interface for various machine learning algorithms for classification, regression, clustering, projection, dynamical systems, reward maximisation and reinforcement learning.

Changes:

New Visualization and Dataset Features:

  • Added 3D visualization of samples and classification, regression and maximization results
  • Added Visualization panel with individual plots, correlations, density, etc.
  • Added Editing tools to drag/magnet data, change class, increase or decrease dimensions of the dataset
  • Added categorical dimensions (indexed dimensions with non-numerical values)
  • Added Dataset Editing panel to swap, delete and rename dimensions, classes or categorical values
  • Several bug-fixes for display, import/export of data, classification performance

New Algorithms and Methodologies:

  • Added Projections to pre-process data (which can then be classified/regressed/clustered), with LDA, PCA, KernelPCA, ICA, CCA
  • Added Grid-Search panel for batch-testing ranges of values for up to two parameters at a time
  • Added One-vs-All multi-class classification for non-multi-class algorithms
  • Trained models can now be kept and tested on new data (training on one dataset, testing on another)
  • Added a dataset generator panel for standard toy datasets (e.g. swissroll, checkerboard, ...)
  • Added a number of clustering, regression and classification algorithms (FLAME, DBSCAN, LOWESS, CCA, KMEANS++, GP Classification, Random Forests)
  • Added Save/Load Model option for GMMs and SVMs
  • Added Growing Hierarchical Self Organizing Maps (original code by Michael Dittenbach)
  • Added Automatic Relevance Determination for SVM with RBF kernel (thanks to Ashwini Shukla!)

