77 projects found that use the BSD license.
Showing items 1-20 of 77 (page 1 of 4).

JMLR MLPACK 2.1.0

by rcurtin - November 1, 2016, 16:01:16 CET. 69401 views, 12332 downloads, 6 subscriptions.

Rating: 4.5/5 (based on 1 vote)

About: A scalable, fast C++ machine learning library, with emphasis on usability.

Changes:

  • Fixed CoverTree to properly handle single-point datasets.
  • Fixed a bug in CosineTree (and thus QUIC-SVD) that caused split failures for some datasets (#717).
  • Added the mlpack_preprocess_describe program, which can be used to print statistics on a given dataset (#742).
  • Fixed prioritized recursion for k-furthest-neighbor search (mlpack_kfn and the KFN class), leading to orders-of-magnitude speedups in some cases.
  • Bumped the minimum required version of Armadillo to 4.200.0.
  • Added a simple gradient descent optimizer, found in src/mlpack/core/optimizers/gradient_descent/ (#792).
  • Added approximate furthest neighbor search algorithms QDAFN and DrusillaSelect in src/mlpack/methods/approx_kfn/, with the command-line program mlpack_approx_kfn.


scikit-multilearn 0.0.3

by niedakh - June 15, 2016, 19:28:32 CET. 1100 views, 274 downloads, 2 subscriptions.

About: A native Python, scikit-compatible implementation of a variety of multi-label classification algorithms.

Changes:

Initial Announcement on mloss.org.
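A minimal usage sketch of the problem-transformation workflow scikit-multilearn offers. The skmultilearn.problem_transform.BinaryRelevance import and its classifier argument follow the library's later layout and are assumptions for version 0.0.3; the fit/predict convention comes from its scikit compatibility.

```python
# Hedged sketch: binary-relevance multi-label classification with a
# scikit-learn base classifier. Module path and class name are assumptions
# for the 0.0.3 release.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from skmultilearn.problem_transform import BinaryRelevance  # assumed module path

X = np.random.rand(100, 20)                   # toy feature matrix
Y = np.random.randint(0, 2, size=(100, 3))    # toy binary label matrix (3 labels)

clf = BinaryRelevance(classifier=GaussianNB())  # one binary model per label
clf.fit(X, Y)                                   # scikit-style fit ...
predictions = clf.predict(X)                    # ... and predict
```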


QMiner 5.0.0

by blazfortuna - April 8, 2016, 14:17:58 CET. 2663 views, 475 downloads, 2 subscriptions.

About: Analytic engine for real-time large-scale streams containing structured and unstructured data.

Changes:

Initial Announcement on mloss.org.


FEAST 1.1.4

by apocock - March 12, 2016, 18:35:08 CET. 39341 views, 7108 downloads, 2 subscriptions.

Rating: 5/5 (based on 2 votes)

About: FEAST provides implementations of common mutual-information-based filter feature selection algorithms (MIM, MIFS, mRMR, CMIM, ICAP, JMI, DISR, FCBF, etc.), and an implementation of RELIEF. Written in C/C++ and MATLAB.

Changes:
  • Fixed an issue where zero MI values would cause it to segfault.
  • Fixes to documentation and comments.
  • Updated internal version of MIToolbox.

Cognitive Foundry 3.4.2

by Baz - October 30, 2015, 06:53:03 CET. 30395 views, 5128 downloads, 4 subscriptions.

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Upgraded MTJ to 1.0.3.
  • Common:
    • Added package for hash function computation, including Eva, FNV-1a, MD5, Murmur2, Prime, SHA1, and SHA2.
    • Added callback-based forEach implementations to Vector and InfiniteVector, which can be faster for iterating through some vector types.
    • Optimized DenseVector by removing a layer of indirection.
    • Added method to compute set of percentiles in UnivariateStatisticsUtil and fixed issue with percentile interpolation.
    • Added utility class for enumerating combinations.
    • Adjusted ScalarMap implementation hierarchy.
    • Added method for copying a map to VectorFactory and moved createVectorCapacity up from SparseVectorFactory.
    • Added method for creating square identity matrix to MatrixFactory.
    • Added Random implementation that uses a cached set of values.
  • Learning:
    • Implemented feature hashing.
    • Added factory for random forests.
    • Implemented uniform distribution over integer values.
    • Added Chi-squared similarity.
    • Added KL divergence.
    • Added general conditional probability distribution.
    • Added interfaces for Regression, UnivariateRegression, and MultivariateRegression.
    • Fixed null pointer exception that can happen in K-means with an empty cluster.
    • Fixed name of maxClusters property on AgglomerativeClusterer (was called maxMinDistance).
  • Text:
    • Improvements to LDA Gibbs sampler.

Optunity 1.1.1

by claesenm - September 30, 2015, 07:06:17 CET. 7859 views, 1816 downloads, 3 subscriptions.

About: Optunity is a library containing various optimizers for hyperparameter tuning. Hyperparameter tuning is a recurrent problem in many machine learning tasks, both supervised and unsupervised. This package provides several distinct approaches to solve such problems, along with helpful facilities such as cross-validation and a plethora of score functions.

Changes:

This minor release has the same feature set as Optunity 1.1.0, but incorporates several bug fixes, mostly related to the specification of structured search spaces.
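A hedged sketch of hyperparameter search with Optunity's maximize helper on a toy objective; the signature with keyword box constraints follows the 1.1-era documentation, but treat the exact names as assumptions.

```python
# Hedged sketch: maximize a toy objective over a box with Optunity.
import optunity

def objective(x, y):
    """Toy objective with its maximum at (1, -2)."""
    return -((x - 1.0) ** 2 + (y + 2.0) ** 2)

# Search x in [-5, 5] and y in [-5, 5] using 100 function evaluations.
optimal_pars, details, _ = optunity.maximize(objective, num_evals=100,
                                             x=[-5, 5], y=[-5, 5])
print(optimal_pars, details.optimum)  # best parameters and best objective value
```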


JMLR Darwin 1.9

by sgould - September 8, 2015, 06:50:37 CET. 55544 views, 11475 downloads, 4 subscriptions.

About: A platform-independent C++ framework for machine learning, graphical models, and computer vision research and development.

Changes:

Version 1.9:

  • Replaced drwnInPaint class with drwnImageInPainter class and added inPaint application
  • Added function to read CIFAR-10 and CIFAR-100 style datasets (see http://www.cs.utoronto.ca/~kriz/cifar.html)
  • Added drwnMaskedPatchMatch, drwnBasicPatchMatch, drwnSelfPatchMatch and basicPatchMatch application
  • drwnPatchMatchGraph now allows multiple matches to the same image
  • Upgraded wxWidgets to 3.0.2 (problems on Mac OS X)
  • Switched Mac OS X compilation to libc++ instead of libstdc++
  • Added Python scripts for running experiments and regression tests
  • Refactored drwnGrabCutInstance class to support both GMM and colour histogram model
  • Added cacheSortIndex to drwnDecisionTree for trading-off speed versus memory usage
  • Added mexLoadPatchMatchGraph for loading drwnPatchMatchGraph objects into Matlab
  • Improved documentation, other bug fixes and performance improvements

LMW Tree 1.0

by cdevries - May 30, 2015, 11:42:23 CET. 2446 views, 472 downloads, 2 subscriptions.

About: Learning M-Way Tree (LMW Tree): web-scale clustering with EM-tree, K-tree, k-means, TSVQ, repeated k-means, random projections, random indexing, hashing, and bit signatures.

Changes:

Initial Announcement on mloss.org.


ClusterEval 1.1

by cdevries - May 18, 2015, 22:01:01 CET. 5718 views, 1330 downloads, 2 subscriptions.

About: Cluster quality evaluation software. Implements cluster quality metrics based on ground truths, such as Purity, Entropy, Negentropy, F1 and NMI. It includes a novel approach to correcting for pathological or ineffective clusterings, called 'Divergence from a Random Baseline'.

Changes:

Moved project to GitHub.


BLOG 0.9.1

by jxwuyi - April 27, 2015, 06:52:05 CET. 2682 views, 599 downloads, 3 subscriptions.

About: Bayesian Logic (BLOG) is a probabilistic modeling language. It is designed for representing relations and uncertainties among real world objects.

Changes:

Initial Announcement on mloss.org.


Java Machine Learning Platform 1.0

by openpr_nlpr - April 2, 2015, 09:02:14 CET. 2522 views, 461 downloads, 2 subscriptions.

About: Jmlp is a Java platform for both machine learning experiments and applications. It has been tested on Windows, but it should also work on Linux since Java is cross-platform. It contains classical classification algorithms (Discrete AdaBoost.MH, Real AdaBoost.MH, SVM, KNN, MCE, MLP, NB) and feature reduction methods (KPCA, PCA, whitening), among others.

Changes:

Initial Announcement on mloss.org.


pyGPs 1.3.2

by mn - January 17, 2015, 13:08:43 CET. 8572 views, 1963 downloads, 4 subscriptions.

About: pyGPs is a Python package for Gaussian process (GP) regression and classification for machine learning.

Changes:

Changelog pyGPs v1.3.2

December 15th 2014

  • pyGPs added to pip
  • mathematical definitions of kernel functions available in documentation
  • more error messages added
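A minimal GP regression sketch in the style of the pyGPs demos; the GPR, setPrior, optimize and predict names follow the 1.3.x documentation and should be treated as assumptions.

```python
# Hedged sketch: Gaussian process regression with pyGPs.
import numpy as np
import pyGPs

x = np.linspace(-3, 3, 30)[:, None]            # training inputs
y = np.sin(x) + 0.1 * np.random.randn(30, 1)   # noisy training targets
z = np.linspace(-4, 4, 100)[:, None]           # test inputs

model = pyGPs.GPR()                                              # GP regression model
model.setPrior(mean=pyGPs.mean.Zero(), kernel=pyGPs.cov.RBF())   # zero mean, RBF kernel
model.optimize(x, y)                                             # fit hyperparameters
ym, ys2, fm, fs2, lp = model.predict(z)                          # predictive mean/variance
```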

Lynx MATLAB Toolbox v0.8-beta

by ispamm - November 19, 2014, 00:56:07 CET. 2144 views, 546 downloads, 1 subscription.

About: A MATLAB toolbox for defining complex machine learning comparisons

Changes:

Initial Announcement on mloss.org.


BACOM2 1.0

by fydennis - October 24, 2014, 15:25:38 CET. 2414 views, 588 downloads, 2 subscriptions.

About: Revised version of BACOM.

Changes:

Initial Announcement on mloss.org.


RLPy 1.3a

by bobklein2 - August 28, 2014, 14:34:35 CET. 5722 views, 1216 downloads, 1 subscription.

About: RLPy is a framework for performing reinforcement learning (RL) experiments in Python. RLPy provides a large library of agent and domain components, and a suite of tools to aid in experiments (parallelization, hyperparameter optimization, code profiling, and plotting).

Changes:
  • Fixed bug where results using same random seed were different with visualization turned on/off
  • Created RLPy package on pypi (Available at https://pypi.python.org/pypi/rlpy)
  • Switched from custom logger class to python default
  • Added unit tests
  • Code readability improvements (formatting, variable names/ordering)
  • Restructured TD Learning hierarchy
  • Updated tutorials
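A hedged sketch of how RLPy wires a domain, representation, policy and agent into an Experiment, following the tutorial pattern; constructor arguments and defaults are assumptions and may differ slightly in 1.3a.

```python
# Hedged sketch of the RLPy agent/domain/experiment wiring; argument names
# below are assumptions based on the tutorial pattern.
from rlpy.Domains import GridWorld
from rlpy.Representations import Tabular
from rlpy.Policies import eGreedy
from rlpy.Agents import Q_Learning
from rlpy.Experiments import Experiment

domain = GridWorld()                           # default grid-world map
representation = Tabular(domain)               # tabular value representation
policy = eGreedy(representation, epsilon=0.1)  # epsilon-greedy exploration
agent = Q_Learning(policy, representation,
                   discount_factor=domain.discount_factor)
experiment = Experiment(agent=agent, domain=domain,
                        max_steps=10000, num_policy_checks=10)
experiment.run()    # run learning with periodic policy evaluation
experiment.save()   # write results to disk
```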

Caffe 0.9999

by sergeyk - August 9, 2014, 01:57:58 CET. 12117 views, 1956 downloads, 2 subscriptions.

About: Caffe aims to provide computer vision scientists with a clean, modifiable implementation of state-of-the-art deep learning algorithms. We believe that Caffe is the fastest available GPU CNN implementation. Caffe also provides seamless switching between CPU and GPU, which allows one to train models with fast GPUs and then deploy them on non-GPU clusters. Even in CPU mode, computing predictions on an image takes only 20 ms (in batch mode).

Changes:

LOTS of stuff: https://github.com/BVLC/caffe/releases/tag/v0.9999
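A hedged sketch of CPU-mode batch prediction with pycaffe; caffe.Classifier, caffe.io.load_image and the mode switch follow pycaffe examples of that era, but the exact entry points moved between releases, so treat the names (and the placeholder file names) as assumptions.

```python
# Hedged sketch: batch prediction with pycaffe. 'deploy.prototxt',
# 'model.caffemodel' and 'cat.jpg' are hypothetical placeholders, and the
# module-level mode switch may be exposed on the net object in 0.9999.
import caffe

net = caffe.Classifier('deploy.prototxt', 'model.caffemodel',
                       raw_scale=255, channel_swap=(2, 1, 0))
caffe.set_mode_cpu()                        # or caffe.set_mode_gpu() on a GPU box
images = [caffe.io.load_image('cat.jpg')]   # a batch of input images
probs = net.predict(images)                 # class probabilities per image
```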


PyStruct 0.2

by t3kcit - July 9, 2014, 09:29:23 CET. 4537 views, 1136 downloads, 1 subscription.

About: PyStruct is a framework for learning structured prediction in Python. It has a modular interface, similar to the well-known SVMstruct. Apart from learning algorithms, it also contains model formulations for popular CRFs and interfaces to many inference algorithm implementations.

Changes:

Initial Announcement on mloss.org.
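A hedged sketch of chain-CRF training with PyStruct's SVMstruct-style learners; the ChainCRF and OneSlackSSVM classes and the list-of-arrays data layout follow the PyStruct documentation, with constructor arguments treated as assumptions.

```python
# Hedged sketch: structured SVM training of a linear-chain CRF with PyStruct.
import numpy as np
from pystruct.models import ChainCRF
from pystruct.learners import OneSlackSSVM

# Each sample is a chain: X[i] has shape (n_nodes, n_features),
# y[i] has shape (n_nodes,) with integer labels.
rng = np.random.RandomState(0)
X = [rng.rand(5, 10) for _ in range(20)]
y = [rng.randint(0, 3, size=5) for _ in range(20)]

model = ChainCRF()                                    # linear-chain CRF model
learner = OneSlackSSVM(model=model, C=0.1, max_iter=100)
learner.fit(X, y)                                     # structured SVM training
predicted_chains = learner.predict(X)                 # list of label sequences
```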


OpenOpt 0.54

by Dmitrey - June 15, 2014, 14:50:37 CET. 68144 views, 14057 downloads, 3 subscriptions.

Rating: 4/5 (based on 2 votes)

About: A universal numerical optimization toolbox written in Python. Problem classes: NLP, LP, QP, NSP, MILP, LSP, LLSP, MMP, GLP, SLE, MOP, etc.; supports general logical constraints, categorical variables, automatic differentiation, stochastic programming, interval analysis, and many other goodies.

Changes:

http://openopt.org/Changelog
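A hedged sketch of solving a small box-constrained NLP with OpenOpt; the NLP constructor, p.solve('ralg') and the r.xf / r.ff result fields follow the OpenOpt documentation and are treated as assumptions.

```python
# Hedged sketch: box-constrained nonlinear minimization with OpenOpt's ralg solver.
from numpy import zeros, ones
from openopt import NLP

f = lambda x: ((x - 3.0) ** 2).sum()                 # simple smooth objective
p = NLP(f, x0=zeros(5), lb=-10 * ones(5), ub=10 * ones(5))
r = p.solve('ralg')                                  # run the ralg solver
print(r.xf, r.ff)                                    # minimizer and objective value
```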


JMLR Tapkee 1.0

by blackburn - April 10, 2014, 02:45:58 CET. 13272 views, 3764 downloads, 1 subscription.

About: Tapkee is an efficient and flexible C++ template library for dimensionality reduction.

Changes:

Initial Announcement on mloss.org.


DRVQ 1.0.1-beta

by iavr - January 18, 2014, 17:26:34 CET. 3115 views, 683 downloads, 1 subscription.

About: DRVQ is a C++ library implementation of dimensionality-recursive vector quantization, a fast vector quantization method in high-dimensional Euclidean spaces under arbitrary data distributions. It is an approximation of k-means that is practically constant in data size and applies to arbitrarily high dimensions, but can only scale to a few thousand centroids. As a by-product of training, a tree structure performs either exact or approximate quantization on trained centroids, the latter being less precise but extremely fast.

Changes:

Initial Announcement on mloss.org.

