Projects that are tagged with classification.

JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10604 views, 2896 downloads, 2 subscriptions

Rating: 5/5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2014-07-22 Version 4.5

New features

  • Input-dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • Option 'autoscale' in gp_rnd.m to obtain split-normal approximated samples from the posterior predictive distribution of the latent variable.

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.

Improvements

  • New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating new features etc.
    • demo_epinf, demonstrating the input-dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes


JMLR Waffles 2014-07-05

by mgashler - July 20, 2014, 04:53:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 21934 views, 6676 downloads, 2 subscriptions

About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.)

Changes:

Added support for CUDA GPU-parallelized neural network layers, and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html


pyGPs 1.2

by mn - July 17, 2014, 10:28:55 CET [ Project Homepage BibTeX Download ] 1429 views, 353 downloads, 2 subscriptions

About: pyGPs is a Python package for Gaussian process (GP) regression and classification for machine learning.

Changes:

Changelog pyGPs v1.2

June 30th 2014

structural updates:

  • input targets can now be given either as a 2-d array of shape (n,1) or as a 1-d array of shape (n,)
  • setup.py updated
  • "import pyGPs" instead of "from pyGPs.Core import gp" (see the usage sketch after this list)
  • rename ".train()" to ".optimize()"
  • rename "Graph-stuff" to "graphExtension"
  • rename kernelOnGraph to "nodeKernels" and graphKernel to "graphKernels"
  • removed redundancy in model.setData(x,y)
  • rewrite "mean.proceed()" as "getMean()" and "getDerMatrix()"
  • rewrite "cov.proceed()" as "getCovMatrix()" and "getDerMatrix()"
  • rename cov.LIN to cov.Linear (to be consistent with mean.Linear)
  • rename module "valid" to "validation"
  • add the graph dataset Mutag (provided as .npz and .mat files)
  • add graphUtil.nomalizeKernel()
  • fix the number-of-iterations problem in the graphKernels "PropagationKernel"
  • add unit tests for covariance and mean functions
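
A minimal regression sketch using the renamed 1.2 interface described above (the synthetic data is purely illustrative, and pyGPs.GPR/model.predict are assumed to behave as in the pyGPs documentation):

    import numpy as np
    import pyGPs                                 # v1.2: replaces "from pyGPs.Core import gp"

    x = np.linspace(-3, 3, 50).reshape(-1, 1)    # inputs of shape (n, 1)
    y = np.sin(x).ravel()                        # targets may now be (n,) or (n, 1)

    model = pyGPs.GPR()                          # Gaussian process regression model
    model.setData(x, y)
    model.optimize()                             # v1.2: replaces the old ".train()"
    model.predict(np.linspace(-3, 3, 20).reshape(-1, 1))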

bug fixes:

  • derivatives for cov.LINard
  • derivative of the scalar for cov.covScale
  • demo_GPR_FITC.py missing pyGPs.mean

July 8th 2014

structural updates:

  • add a hyperparameter (signal variance s2) to the linear covariance
  • add unit tests for inference and likelihood functions as well as models
  • no longer show (print) the "maximum number of sweeps" warning in EP inference
  • documentation updated

bug fixes:

  • typos in lik.Laplace
  • derivative in lik.Laplace

July 14th 2014

documentation updates:

  • online docs updated
  • API file updated

structural updates:

  • made methods that users do not need to call private

RankSVM NC 1.0

by rflamary - July 10, 2014, 15:51:21 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 314 views, 60 downloads, 1 subscription

About: This package is an implementation of a linear RankSVM solver with non-convex regularization.

Changes:

Initial Announcement on mloss.org.


JMLR dlib ml 18.9

by davis685 - June 17, 2014, 01:05:32 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 75849 views, 13226 downloads, 2 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools that facilitate creating complex software in C++ to solve real world problems.

Changes:

Fixed a bug in the way file serialization was being handled on MS Windows platforms.


Boosted Decision Trees and Lists 1.0.3

by melamed - May 1, 2014, 15:19:29 CET [ BibTeX Download ] 1920 views, 606 downloads, 2 subscriptions

About: Boosting algorithms for classification and regression, with many variations. Features include: Scalable and robust; Easily customizable loss functions; One-shot training for an entire regularization path; Continuous checkpointing; much more

Changes:
  • faster warm-start

  • made it easier to add more library paths to local makefile

  • added scripts to remove rare features and to standardize features


WEKA 3.7.11

by mhall - April 24, 2014, 10:13:12 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 37630 views, 5376 downloads, 2 subscriptions

Rating: 4/5 stars (based on 6 votes)

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...]

Changes:

In core Weka:

  • Bagging and RandomForest are now faster if the base learner is a WeightedInstancesHandler
  • Speed-ups for REPTree and other classes that use entropy calculations
  • Many other code improvements and speed-ups
  • Additional statistics available in the output of LinearRegression and SimpleLinearRegression. Contributed by Chris Meyer
  • Reduced memory consumption in BayesNet
  • Improvements to the package manager: load status of individual packages can now be toggled to prevent a package from loading; "Available" button now displays the latest version of all available packages that are compatible with the base version of Weka
  • RandomizableFilteredClassifier
  • Canopy clusterer
  • ImageViewer KnowledgeFlow component
  • PMML export support for Logistic. Infrastructure and changes contributed by David Person
  • Extensive tool-tips now displayed in the Explorer's scheme selector tree lists
  • Join KnowledgeFlow component for performing an inner join on two incoming streams/data sets

In packages:

  • IWSSembeded package, contributed by Pablo Bermejo
  • CVAttributeEval package, contributed by Justin Liang
  • distributedWeka package for Hadoop
  • Improvements to multiLayerPerceptrons and addition of MLPAutoencoder
  • Code clean-up in many packages

JMLR MOA Massive Online Analysis Nov-13

by abifet - April 4, 2014, 03:50:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10516 views, 4164 downloads, 1 subscription

About: Massive Online Analysis (MOA) is a real-time analytics tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license.
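
As a conceptual illustration of the test-then-train ("prequential") evaluation style that such stream-learning experiments use, here is a short Python sketch; it does not use MOA's Java API, and the drifting stream and the scikit-learn learner are stand-ins chosen for brevity:

    # Conceptual sketch only: prequential evaluation (test on each example first,
    # then train on it) of an incremental learner on a gradually drifting stream.
    import numpy as np
    from sklearn.linear_model import SGDClassifier   # stand-in incremental learner

    rng = np.random.default_rng(0)
    learner = SGDClassifier()                         # any classifier with partial_fit works
    classes = np.array([0, 1])
    correct = tested = 0

    for t in range(10000):
        x = rng.normal(size=(1, 5))
        y = np.array([int(x[0, 0] + 0.0002 * t * x[0, 1] > 0)])   # concept drift over time
        if t > 0:                                     # test first ...
            correct += int(learner.predict(x)[0] == y[0])
            tested += 1
        learner.partial_fit(x, y, classes=classes)    # ... then train
    print("prequential accuracy:", correct / tested)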

Changes:

New version November 2013


JMLR EnsembleSVM 2.0

by claesenm - March 31, 2014, 08:06:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4034 views, 1438 downloads, 2 subscriptions

About: The EnsembleSVM library offers functionality to perform ensemble learning using Support Vector Machine (SVM) base models. In particular, we offer routines for binary ensemble models using SVM base classifiers. Experimental results have shown the predictive performance to be comparable with standard SVM models but with drastically reduced training time. Ensemble learning with SVM models is particularly useful for semi-supervised tasks.

Changes:

The library has been updated and features a variety of new functionality as well as more efficient implementations of original features. The following key improvements have been made:

  1. Support for multithreading in training and prediction with ensemble models. Since both of these are embarrassingly parallel, this yields a significant speedup (roughly 3-fold on a quad-core machine).
  2. An extensive programming framework for aggregating base model predictions, which allows highly efficient prototyping of new aggregation approaches. Additionally, we provide several predefined strategies, including (weighted) majority voting, logistic regression and nonlinear SVMs of your choice -- be sure to check out the esvm-edit tool! The provided framework also allows you to efficiently program your own, novel aggregation schemes (a conceptual sketch of weighted majority voting follows this list).
  3. Full code transition to C++11, the latest C++ standard, which enabled various performance improvements. The new release requires moderately recent compilers, such as gcc 4.7.2+ or clang 3.2+.
  4. Generic implementations of convenient facilities have been added, such as thread pools, deserialization factories and more.
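
As a conceptual illustration of the simplest predefined aggregation strategy, weighted majority voting over base classifier outputs could be sketched as follows in Python (this is not the EnsembleSVM C++ API; the labels and weights are illustrative):

    # Conceptual sketch only: weighted majority voting over the {-1, +1}
    # predictions of an ensemble of base classifiers.
    def weighted_majority_vote(predictions, weights):
        score = sum(w * p for w, p in zip(weights, predictions))
        return 1 if score >= 0 else -1

    # three base models vote +1, -1, +1 with unequal weights -> overall vote is -1
    print(weighted_majority_vote([1, -1, 1], [0.5, 2.0, 0.8]))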

The API and ABI have undergone significant changes, many of which are due to the transition to C++11.


Malheur 0.5.4

by konrad - December 25, 2013, 13:20:31 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 11367 views, 2207 downloads, 1 subscription

About: Automatic Analysis of Malware Behavior using Machine Learning

Changes:

Support for new version of libarchive. Minor bug fixes.


Gesture Recognition Toolkit 0.1 Revision 289

by ngillian - December 13, 2013, 22:59:53 CET [ Project Homepage BibTeX Download ] 2877 views, 536 downloads, 1 subscription

About: The Gesture Recognition Toolkit (GRT) is a cross-platform, open-source, C++ machine learning library that has been specifically designed for real-time gesture recognition. It features a large number of machine learning algorithms for both classification and regression in addition to a wide range of supporting algorithms for pre-processing, feature extraction and dataset management. The GRT has been designed for real-time gesture recognition, but it can also be applied to more general machine learning tasks.

Changes:

Added Decision Tree and Random Forests.


LIBOL 0.3.0

by stevenhoi - December 12, 2013, 15:26:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6412 views, 1874 downloads, 2 subscriptions

About: LIBOL is an open-source library with a family of state-of-the-art online learning algorithms for machine learning and big data analytics research. The current version supports 16 online algorithms for binary classification and 13 online algorithms for multiclass classification.

Changes:

Compared with the previous version (V0.2.3), the new version (V0.3.0) makes the following important changes:

• Added a template and guide for adding new algorithms;

• Improved parameter settings and made the documentation clearer;

• Improved documentation on data formats and key functions;

• Amended the "OGD" function to use different loss types (a conceptual sketch follows this list);

• Fixed some naming inconsistencies and other minor bugs.
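
As a conceptual sketch of what an online gradient descent (OGD) routine with selectable loss types computes, consider the following Python fragment; it is not LIBOL's MATLAB/C++ interface, and the loss names and step-size schedule are illustrative:

    # Conceptual sketch only: online gradient descent for binary classification
    # with labels in {-1, +1} and a selectable loss type.
    import numpy as np

    def ogd(stream, dim, loss="hinge", eta=0.1):
        w = np.zeros(dim)
        for t, (x, y) in enumerate(stream, start=1):
            margin = y * w.dot(x)
            if loss == "hinge":                       # max(0, 1 - margin)
                grad = -y * x if margin < 1 else np.zeros(dim)
            elif loss == "logistic":                  # log(1 + exp(-margin))
                grad = -y * x / (1.0 + np.exp(margin))
            else:                                     # squared: 0.5 * (1 - margin)^2
                grad = -(1.0 - margin) * y * x
            w -= (eta / np.sqrt(t)) * grad            # 1/sqrt(t) step-size schedule
        return w

    rng = np.random.default_rng(1)
    stream = [(x, 1.0 if x[0] > x[1] else -1.0) for x in rng.normal(size=(1000, 5))]
    w = ogd(stream, dim=5, loss="logistic")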


JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.4

by hn - November 11, 2013, 14:46:52 CET [ Project Homepage BibTeX Download ] 17231 views, 4202 downloads, 3 subscriptions

Rating: 5/5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.

Changes:
  • derivatives w.r.t. inducing points xu in infFITC, infFITC_Laplace, infFITC_EP so that one can treat the inducing points either as fixed given quantities or as additional hyperparameters
  • new GLM likelihood likExp for inter-arrival time modeling
  • new GLM likelihood likWeibull for extremal value regression
  • new GLM likelihood likGumbel for extremal value regression
  • new mean function meanPoly depending polynomially on the data
  • infExact can deal safely with the zero noise variance limit
  • support of GP warping through the new likelihood function likGaussWarp

Hivemall 0.1

by myui - October 25, 2013, 08:43:12 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2658 views, 423 downloads, 1 subscription

About: Hivemall is a scalable machine learning library running on Hive/Hadoop, licensed under the LGPL 2.1.

Changes:
  • Enhancement

    • Added AROW regression
    • Added AROW with a hinge loss (arowh_regress())
  • Bugfix

    • Fixed a bug of null feature handling in classification/regression

pySPACE 1.0

by krell84 - August 23, 2013, 21:00:32 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1382 views, 329 downloads, 1 subscription

About: --Signal Processing and Classification Environment in Python using YAML and supporting parallelization-- pySPACE is modular software for processing large data streams that has been specifically designed to enable distributed execution and empirical evaluation of signal processing chains. Various signal processing algorithms (so-called nodes) are available within the software, ranging from finite impulse response filters through data-dependent spatial filters (e.g. CSP, xDAWN) to established classifiers (e.g. SVM, LDA). pySPACE incorporates the concept of nodes and node chains from the MDP framework. Due to its modular architecture, the software can easily be extended with new processing nodes and more general operations. Large-scale empirical investigations can be configured using simple text configuration files in the YAML format, executed on different (distributed) computing modalities, and evaluated using an interactive graphical user interface.
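
As a conceptual illustration of the node-chain idea (not pySPACE's actual API or YAML schema), a chain of processing nodes can be thought of as a sequence of callables applied to each incoming data window:

    # Conceptual sketch only: a "node chain" as a sequence of processing steps
    # applied to one data window (the node names and steps are illustrative).
    import numpy as np

    def detrend(window):                  # remove the per-channel mean
        return window - window.mean(axis=0)

    def spatial_filter(window):           # project onto the first principal direction
        _, _, vt = np.linalg.svd(window, full_matrices=False)
        return window @ vt[0]

    def classify(features):               # stand-in threshold "classifier"
        return int(features.mean() > 0)

    node_chain = [detrend, spatial_filter, classify]

    window = np.random.default_rng(0).normal(size=(128, 8))   # 128 samples x 8 channels
    result = window
    for node in node_chain:
        result = node(result)
    print("predicted class:", result)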

Changes:

First release. Initial Announcement on mloss.org.


Apache Mahout 0.8

by gsingers - July 27, 2013, 15:52:32 CET [ Project Homepage BibTeX Download ] 14592 views, 4092 downloads, 2 subscriptions

About: Apache Mahout is an Apache Software Foundation project with the goal of creating both a community of users and a scalable, Java-based framework consisting of many machine learning algorithm [...]

Changes:

Apache Mahout 0.8 contains, amongst a variety of performance improvements and bug fixes, an implementation of Streaming K-Means, deeper Lucene/Solr integration and new scalable recommender algorithms. For a full description of the newest release, see http://mahout.apache.org/.


JMLR Jstacs 2.1

by keili - June 3, 2013, 07:32:55 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 13245 views, 3104 downloads, 2 subscriptions

About: A Java framework for statistical analysis and classification of biological sequences

Changes:

New classes:

  • MultipleIterationsCondition: Requires another TerminationCondition to fail a specified number of times in a row
  • ClassifierFactory: Allows for creating standard classifiers
  • SeqLogoPlotter: Plots PNG sequence logos from within Jstacs
  • MultivariateGaussianEmission: Multivariate Gaussian emission density for a Hidden Markov Model
  • MEManager: Maximum entropy model

New features and improvements:

  • Alignment: Added free shift alignment
  • PerformanceMeasure and sub-classes: Extension to weighted test data
  • AbstractClassifier, ClassifierAssessment and sub-classes: Adaptation to weighted PerformanceMeasures
  • DNAAlphabet: Parser speed-up
  • PFMComparator: Extension to PFM from other sources/databases
  • ToolBox: New convenience methods for computing several statistics (e.g., median, correlation)
  • SignificantMotifOccurrencesFinder: New methods for computing PWMs and statistics from predictions
  • SequenceScore and sub-classes: New method toString(NumberFormat)
  • DataSet: Adaptation to weighted data, e.g., partitioning
  • REnvironment: Changed several methods from String to CharSequence

Restructuring:

  • changed MultiDimensionalSequenceWrapperDiffSM to MultiDimensionalSequenceWrapperDiffSS

Several minor new features, bug fixes, and code cleanups


Cognitive Foundry 3.3.3

by Baz - May 21, 2013, 05:59:37 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 15316 views, 2428 downloads, 2 subscriptions

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Made code able to compile under both Java 1.6 and 1.7. This required removing some potentially unsafe methods that used varargs with generics.
    • Upgraded XStream dependency to 1.4.4.
    • Improved support for regression algorithms in learning.
    • Added general-purpose adapters to make it easier to compose learning algorithms and adapt their input or output.
  • Common Core:
    • Added isSparse, toArray, dotDivide, and dotDivideEquals methods for Vector and Matrix.
    • Added scaledPlus, scaledPlusEquals, scaledMinus, and scaledMinusEquals to Ring (and thus Vector and Matrix) for potentially faster such operations.
    • Fixed issue where matrix and dense vector equals was not checking for equal dimensionality.
    • Added transform, transformEquals, transformNonZeros, and transformNonZerosEquals to Vector.
    • Made LogNumber into a signed version of a log number and moved the prior unsigned implementation into UnsignedLogNumber.
    • Added EuclideanRing interface that provides methods for times, timesEquals, divide, and divideEquals. Also added Field interface that provides methods for inverse and inverseEquals. These interfaces are now implemented by the appropriate number classes such as ComplexNumber, MutableInteger, MutableLong, MutableDouble, LogNumber, and UnsignedLogNumber.
    • Added interface for Indexer and DefaultIndexer implementation for creating a zero-based indexing of values.
    • Added interfaces for MatrixFactoryContainer and DivergenceFunctionContainer.
    • Added ReversibleEvaluator, which various identity functions implement, as well as a new utility class ForwardReverseEvaluatorPair for creating a reversible evaluator from a pair of other evaluators.
    • Added method to create an ArrayList from a pair of values in CollectionUtil.
    • ArgumentChecker now properly throws assertion errors for NaN values. Also added checks for long types.
    • Fixed handling of Infinity in subtraction for LogMath.
    • Fixed issue with angle method that would cause a NaN if cosine had a rounding error.
    • Added new createMatrix methods to MatrixFactory that initializes the Matrix with the given value.
    • Added copy, reverse, and isEmpty methods for several array types to ArrayUtil.
    • Added utility methods for creating a HashMap, LinkedHashMap, HashSet, or LinkedHashSet with an expected size to CollectionUtil.
    • Added getFirst and getLast methods for List types to CollectionUtil.
    • Removed some calls to System.out and Exception.printStackTrace.
  • Common Data:
    • Added create method for IdentityDataConverter.
    • ReversibleDataConverter now is an extension of ReversibleEvaluator.
  • Learning Core:
    • Added general learner transformation capability to make it easier to adapt and compose algorithms. InputOutputTransformedBatchLearner provides this capability for supervised learning algorithms by composing together a triplet. CompositeBatchLearnerPair does it for a pair of algorithms.
    • Added constant and identity learners.
    • Added Chebyshev, Identity, and Minkowski distance metrics.
    • Added methods to DatasetUtil to get the output values for a dataset and to compute the sum of weights.
    • Made generics more permissive for supervised cost functions.
    • Added ClusterDistanceEvaluator for taking a clustering that encodes the distance from an input value to all clusters and returns the result as a vector.
    • Fixed potential round-off issue in decision tree splitter.
    • Added random subspace technique, implemented in RandomSubspace.
    • Separated functionality from LinearFunction into IdentityScalarFunction. LinearFunction by default is the same, but has parameters that can change the slope and offset of the function.
    • Default squashing function for GeneralizedLinearModel and DifferentiableGeneralizedLinearModel is now a linear function instead of an atan function.
    • Added a weighted estimator for the Poisson distribution.
    • Added Regressor interface for evaluators that are the output of (single-output) regression learning algorithms. Existing such evaluators have been updated to implement this interface.
    • Added support for regression ensembles including additive and averaging ensembles with and without weights. Added a learner for regression bagging in BaggingRegressionLearner.
    • Added a simple univariate regression class in UnivariateLinearRegression.
    • MultivariateDecorrelator now is a VectorInputEvaluator and VectorOutputEvaluator.
    • Added bias term to PrimalEstimatedSubGradient.
  • Text Core:
    • Fixed an issue where the start position of tokens from LetterNumberTokenizer was off by one for all but the first token.

MLDemos 0.5.1

by basilio - March 2, 2013, 16:06:13 CET [ Project Homepage BibTeX Download ] 17021 views, 4062 downloads, 2 subscriptions

About: MLDemos is a user-friendly visualization interface for various machine learning algorithms for classification, regression, clustering, projection, dynamical systems, reward maximisation and reinforcement learning.

Changes:

New Visualization and Dataset Features:

  • Added 3D visualization of samples and classification, regression and maximization results
  • Added Visualization panel with individual plots, correlations, density, etc.
  • Added Editing tools to drag/magnet data, change class, increase or decrease dimensions of the dataset
  • Added categorical dimensions (indexed dimensions with non-numerical values)
  • Added Dataset Editing panel to swap, delete and rename dimensions, classes or categorical values
  • Several bug-fixes for display, import/export of data, classification performance

New Algorithms and Methodologies:

  • Added Projections to pre-process data (which can then be classified/regressed/clustered), with LDA, PCA, KernelPCA, ICA, CCA
  • Added Grid-Search panel for batch-testing ranges of values for up to two parameters at a time
  • Added One-vs-All multi-class classification for non-multi-class algorithms
  • Trained models can now be kept and tested on new data (training on one dataset, testing on another)
  • Added a dataset generator panel for standard toy datasets (e.g. swissroll, checkerboard, ...)
  • Added a number of clustering, regression and classification algorithms (FLAME, DBSCAN, LOWESS, CCA, KMEANS++, GP Classification, Random Forests)
  • Added Save/Load Model option for GMMs and SVMs
  • Added Growing Hierarchical Self Organizing Maps (original code by Michael Dittenbach)
  • Added Automatic Relevance Determination for SVM with RBF kernel (Thanks to Ashwini Shukla!)


Orange 2.6

by janez - February 14, 2013, 18:15:08 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 11024 views, 2187 downloads, 1 subscription

Rating: 4/5 stars (based on 1 vote)

About: Orange is a component-based machine learning and data mining software. It includes a friendly yet powerful and flexible graphical user interface for visual programming. For more advanced use(r)s, [...]

Changes:

The core of the system (except the GUI) no longer includes any GPL code and can be licensed under the terms of BSD upon request. The graphical part remains under GPL.

Changed the BibTeX reference to the paper recently published in JMLR MLOSS.

