Projects that are platform independent.
Showing items 1-20 of 99 (page 1 of 5).

JMLR Jstacs 2.3

by keili - September 13, 2017, 14:25:38 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 31182 views, 7154 downloads, 4 subscriptions

About: A Java framework for statistical analysis and classification of biological sequences

Changes:

New classes and packages:

  • Jstacs 2.3 is the first release to be accompanied by JstacsFX, a library for building JavaFX-based graphical user interfaces on top of JstacsTools
  • new interface MultiThreadedFunction
  • new class LargeSequenceReader for reading large sequence files in chunks
  • new interface QuickScanningSequenceScore
  • new class RegExpValidator for checking String inputs against a regular expression
  • new class IUPACDNAAlphabet

New features and improvements:

  • Alignments may now handle different costs for insertion and deletion gaps (see the sketch after this list)
  • ListResults may now be constructed from Collections of ResultSets
  • Several minor improvements and bugfixes in many classes
  • Improvements of documentation of several classes
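
The asymmetric gap costs mentioned in the first item simply mean that the alignment recursion charges insertions and deletions differently. A generic Python sketch of that idea (illustrative only, not Jstacs' Java API):

    # Global alignment score with separate insertion and deletion gap costs.
    def align_score(a, b, match=1.0, mismatch=-1.0, ins_cost=-2.0, del_cost=-1.0):
        n, m = len(a), len(b)
        D = [[0.0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            D[i][0] = D[i - 1][0] + del_cost       # gap in b (deletion)
        for j in range(1, m + 1):
            D[0][j] = D[0][j - 1] + ins_cost       # gap in a (insertion)
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                sub = match if a[i - 1] == b[j - 1] else mismatch
                D[i][j] = max(D[i - 1][j - 1] + sub,
                              D[i - 1][j] + del_cost,
                              D[i][j - 1] + ins_cost)
        return D[n][m]

    print(align_score("ACGT", "ACGGT"))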

JMLR MLPACK 2.2.5

by rcurtin - August 26, 2017, 06:07:47 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 90100 views, 16124 downloads, 6 subscriptions

Rating: 4.5 / 5 (based on 1 vote)

About: A scalable, fast C++ machine learning library, with emphasis on usability.

Changes:

Released August 25, 2017.

  • Compilation fix for some systems (#1082).

  • Fix PARAM_INT_OUT() (#1100).


KeLP 2.2.1

by kelpadmin - August 7, 2017, 17:20:39 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 18618 views, 3932 downloads, 3 subscriptions

About: Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as an agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions and learning algorithms through the definition of specific interfaces. Once a new kernel function has been implemented, it can be automatically adopted in all the available kernel-machine algorithms. KeLP includes different Online and Batch Learning algorithms for Classification, Regression and Clustering, as well as several Kernel functions, ranging from vector-based to structural kernels. It makes it possible to build complex kernel-machine-based systems, leveraging JSON/XML interfaces to instantiate prediction models without writing a single line of code.
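
To picture the decoupling described above, here is a minimal sketch (in Python rather than KeLP's Java, and not using KeLP's actual API): a kernel machine is written once against a kernel interface, so any newly implemented kernel function plugs in unchanged. KeLP realizes the same pattern through the interfaces mentioned in the description.

    import numpy as np

    def linear_kernel(a, b):
        return float(np.dot(a, b))

    def rbf_kernel(a, b, gamma=0.5):
        return float(np.exp(-gamma * np.sum((a - b) ** 2)))

    def kernel_perceptron(X, y, kernel, epochs=10):
        """Online kernel perceptron; works with any kernel function."""
        alpha = np.zeros(len(X))
        for _ in range(epochs):
            for i, x in enumerate(X):
                score = sum(alpha[j] * y[j] * kernel(X[j], x) for j in range(len(X)))
                if y[i] * score <= 0:
                    alpha[i] += 1.0
        return alpha

    X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
    y = np.array([-1, -1, 1, 1])
    alpha_lin = kernel_perceptron(X, y, linear_kernel)   # swap kernels freely
    alpha_rbf = kernel_perceptron(X, y, rbf_kernel)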

Changes:

In addition to minor bug fixes, this release includes:

  • A new cache (FixSizeKernelCache) that can store a larger number of computations.

  • Evaluators for measuring the quality of Clustering algorithms.

Furthermore, we also released the new module kelp-input-generator, which contains facilities to parse text snippets and generate tree representations for KeLP!

Check out this new version from our repositories. API Javadoc is already available. Your suggestions will be very valuable to us, so download and try KeLP 2.2.1!


KeBABS 1.5.4

by UBod - July 28, 2017, 09:55:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 21587 views, 3876 downloads, 3 subscriptions

About: Kernel-Based Analysis of Biological Sequences

Changes:
  • importing the apcluster package to avoid method clashes
  • improved and completed change history in inst/NEWS and package vignette

APCluster 1.4.4

by UBod - July 28, 2017, 09:47:32 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 46720 views, 7798 downloads, 3 subscriptions

Rating: 4.5 / 5 (based on 2 votes)

About: The apcluster package implements Frey's and Dueck's Affinity Propagation clustering in R. The package further provides leveraged affinity propagation, exemplar-based agglomerative clustering, and various tools for visual analysis of clustering results.
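
For readers who want to try affinity propagation quickly from Python, a rough stand-in (scikit-learn, not the apcluster R package itself) on a precomputed similarity matrix of negative squared Euclidean distances, analogous to apcluster's negDistMat(r=2):

    import numpy as np
    from sklearn.cluster import AffinityPropagation

    X = np.random.default_rng(0).standard_normal((150, 2))
    S = -((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # similarity matrix
    ap = AffinityPropagation(affinity="precomputed", random_state=0).fit(S)
    print("exemplars:", ap.cluster_centers_indices_)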

Changes:
  • changed dependency on suggested package 'kebabs' to version 1.5.4 or later for improved interoperability
  • bug fix in as.dendrogram() method with signature 'AggExResult'
  • added discrepancy metric to distance computations and updated src/distanceL.c to new version
  • registered C/C++ subroutines
  • minor change in the vignette template
  • moved NEWS to inst/NEWS
  • added inst/COPYRIGHT

MLweb 1.0

by lauerfab - July 7, 2017, 14:43:52 CET [ Project Homepage BibTeX Download ] 11249 views, 2689 downloads, 3 subscriptions

About: MLweb is an open source project that aims at bringing machine learning capabilities into web pages and web applications, while maintaining all computations on the client side. It includes (i) a JavaScript library to enable scientific computing within web pages, (ii) a JavaScript library implementing machine learning algorithms for classification, regression, clustering and dimensionality reduction, and (iii) a web application providing a MATLAB-like development environment.

Changes:
  • Faster LeastSquares and RidgeRegression with the conjugate gradient method (see the sketch after this list)
  • LeastSquares now works also with sparse X
  • Faster thin SVD for tall matrices
  • Fix load data file in LALOLab
  • Add examples in LALOLab
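
The conjugate gradient speed-up referenced in the first item comes from solving the regularized normal equations iteratively instead of factorizing them. A hedged SciPy sketch of that idea (not MLweb's JavaScript code):

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 50))
    y = X @ rng.standard_normal(50) + 0.1 * rng.standard_normal(500)
    lam = 1.0

    # Solve (X'X + lam*I) w = X'y without ever forming the matrix explicitly.
    A = LinearOperator((50, 50), matvec=lambda w: X.T @ (X @ w) + lam * w, dtype=float)
    w, info = cg(A, X.T @ y)
    print("converged" if info == 0 else "not converged")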

SparklingGraph 0.0.7

by riomus - May 22, 2017, 15:29:28 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6986 views, 1514 downloads, 3 subscriptions

About: Large scale, distributed graph processing made easy.

Changes:

  • Graph partitioning methods
  • APSP approximation method
  • Performance improvements
  • License change
  • Bug fixes


Kernel Adaptive Filtering Toolbox 2.0

by steven2358 - May 22, 2017, 10:05:33 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 11571 views, 1906 downloads, 2 subscriptions

About: A Matlab benchmarking toolbox for online and adaptive regression with kernels.

Changes:
  • Changes in algorithms' Matlab class format
  • New algorithms
  • Minor improvements and bug fixes

Calibrated AdaMEC 1.0

by nnikolaou - April 8, 2017, 13:57:45 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1620 views, 306 downloads, 3 subscriptions

About: Code for Calibrated AdaMEC for binary cost-sensitive classification. The method is just AdaBoost that properly calibrates its probability estimates and uses a cost-sensitive (i.e. risk-minimizing) decision threshold to classify new data.
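
The description above is essentially a two-step recipe: calibrate AdaBoost's probability estimates, then classify with the risk-minimizing threshold c_FP / (c_FP + c_FN). A rough scikit-learn sketch of that recipe (not the authors' released code; the costs and dataset below are illustrative):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.calibration import CalibratedClassifierCV
    from sklearn.model_selection import train_test_split

    c_fp, c_fn = 1.0, 5.0                                  # assumed misclassification costs
    X, y = make_classification(n_samples=2000, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Calibrate AdaBoost's scores (e.g. via sigmoid/Platt scaling).
    clf = CalibratedClassifierCV(AdaBoostClassifier(n_estimators=100), method="sigmoid", cv=5)
    clf.fit(X_tr, y_tr)

    # Risk-minimizing decision rule: predict positive iff p(y=1|x) >= c_fp / (c_fp + c_fn).
    threshold = c_fp / (c_fp + c_fn)
    y_pred = (clf.predict_proba(X_te)[:, 1] >= threshold).astype(int)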

Changes:

Updated license information


revrand 1.0.0

by dsteinberg - January 29, 2017, 04:33:54 CET [ Project Homepage BibTeX Download ] 15018 views, 3158 downloads, 3 subscriptions

Rating: 0 / 5 (based on 1 vote)

About: A library of scalable Bayesian generalised linear models with fancy features

Changes:
  • 1.0 release!
  • Now there is a random search phase before optimization of all hyperparameters in the regression algorithms. This improves the performance of revrand, since local optima are more easily avoided with this improved initialisation (see the sketch after this list)
  • Regression regularizers (weight variances) are now associated with each basis object; this approximates GP kernel addition more closely
  • Random state can be set for all random objects
  • Numerous small improvements to make revrand production ready
  • Final report
  • Documentation improvements
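
A generic sketch of the random-search initialisation mentioned in the first item (not revrand's internals): evaluate the objective at a handful of random hyperparameter draws and start the local optimizer from the best one.

    import numpy as np
    from scipy.optimize import minimize

    def objective(theta):                    # stand-in for a negative log marginal likelihood
        return np.sum((theta - 1.5) ** 2) + np.sin(5 * theta).sum()

    rng = np.random.default_rng(0)
    candidates = rng.uniform(-3, 3, size=(20, 2))            # random search phase
    theta0 = min(candidates, key=objective)                   # best random draw
    result = minimize(objective, theta0, method="L-BFGS-B")   # local optimization
    print(result.x)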

JMLR GPML Gaussian Processes for Machine Learning Toolbox 4.0

by hn - October 19, 2016, 10:15:05 CET [ Project Homepage BibTeX Download ] 47788 views, 10496 downloads, 5 subscriptions

Rating: 5 / 5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave/Matlab implementation of inference and prediction with Gaussian process models. The toolbox offers exact inference, approximate inference for non-Gaussian likelihoods (Laplace's method, Expectation Propagation, Variational Bayes), as well as approximations for large datasets (FITC, VFE, KISS-GP). A wide range of covariance, likelihood, mean and hyperprior functions makes it possible to create very complex GP models.
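
GPML itself is Octave/Matlab; as a language-neutral illustration of what exact inference with a Gaussian likelihood computes, here is a minimal NumPy sketch of the standard Cholesky-based predictive equations (not GPML code):

    import numpy as np

    def rbf(A, B, ell=1.0, sf=1.0):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sf**2 * np.exp(-0.5 * d2 / ell**2)

    x = np.linspace(0, 5, 30)[:, None]
    y = np.sin(x).ravel() + 0.1 * np.random.randn(30)
    xs = np.linspace(0, 5, 100)[:, None]
    sn = 0.1                                       # noise standard deviation

    K = rbf(x, x) + sn**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = rbf(x, xs).T @ alpha                      # predictive mean
    v = np.linalg.solve(L, rbf(x, xs))
    var = np.diag(rbf(xs, xs)) - (v**2).sum(0)     # predictive variance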

Changes:

A major code restructuring effort took place in the current release, unifying certain inference functions and allowing more flexibility in covariance function composition. We also redesigned the whole derivative computation pipeline to strongly improve overall runtime. Finally, we now include grid-based covariance approximations natively.

More generic sparse approximation using Power EP

  • unified treatment of FITC approximation, variational approaches VFE and hybrids

  • inducing input optimisation for all (compositions of) covariance functions dropping the previous limitation to a few standard examples

  • infFITC is now covered by the more generic infGaussLik function

Approximate covariance object unifying sparse approximations, grid-based approximations and exact covariance computations

  • implementation in cov/apx, cov/apxGrid, cov/apxSparse

  • generic infGaussLik unifies infExact, infFITC and infGrid

  • generic infLaplace unifies infLaplace, infFITC_Laplace and infGrid_Laplace

Hierarchical structure of covariance functions

  • clear hierarchical compositional implementation

  • no more code duplication as present in covSEiso and covSEard pairs

  • two mother covariance functions

    • covDot for dot-product-based covariances and

    • covMaha for Mahalanobis-distance-based covariances

  • a variety of modifiers: eye, iso, ard, proj, fact, vlen

  • more flexibility as more variants are available and possible

  • all covariance functions offer derivatives w.r.t. inputs

Faster derivative computations for mean and cov functions

  • switched from partial derivatives to directional derivatives

  • simpler and more concise interface of mean and cov functions

  • much faster marginal likelihood derivative computations

  • simpler and more compact code

New mean functions

  • new mean/meanWSPC (Weighted Sum of Projected Cosines or Random Kitchen Sink features) following a suggestion by William Herlands

  • new mean/meanWarp for constructing a new mean from an existing one by means of a warping function adapted from William Herlands

New optimizer

  • added a new minimize_minfunc, contributed by Truong X. Nghiem

New GLM link function

  • added the twice logistic link function util/glm_invlink_logistic2

Smaller fixes

  • two-fold speedup of util/elsympol (used by covADD) by Truong X. Nghiem

  • bugfix in util/logphi as reported by John Darby


AMIDST Toolbox 0.6.0

by ana - October 14, 2016, 19:35:27 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 8239 views, 1502 downloads, 4 subscriptions

About: A Java Toolbox for Scalable Probabilistic Machine Learning.

Changes:
  • Added sparklink module implementing the integration with Apache Spark. More information is available on the project homepage.
  • Fluent pattern in latent-variable-models
  • Predefined model implementing the concept drift detection

Detailed information can be found on the toolbox's web page.


About: Nowadays it is very popular to use deep architectures in machine learning. Deep Belief Networks (DBNs) are deep architectures that use a stack of Restricted Boltzmann Machines (RBMs) to create a powerful generative model from training data. DBNs have many abilities, such as feature extraction and classification, that are used in many applications including image processing, speech processing, text categorization, etc. This paper introduces a new object-oriented toolbox with the most important abilities needed for the implementation of DBNs. According to the results of the experiments conducted on the MNIST (image), ISOLET (speech), and 20 Newsgroups (text) datasets, it was shown that the toolbox can automatically learn a good representation of the input from unlabeled data, with better discrimination between different classes. Also, on all the aforementioned datasets, the obtained classification errors are comparable to those of state-of-the-art classifiers. In addition, the toolbox supports different sampling methods (e.g. Gibbs, CD, PCD and our new FEPCD method), different sparsity methods (quadratic, rate distortion and our new normal method), different RBM types (generative and discriminative), GPU-based computation, etc. The toolbox is a user-friendly open source software package in MATLAB and Octave and is freely available on the website.
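
As a hint of what such a toolbox trains under the hood, here is a generic NumPy sketch of one contrastive-divergence (CD-1) update for a binary RBM, the building block stacked into a DBN (illustrative only, not the toolbox's MATLAB/Octave code):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_step(W, b, c, v0, lr=0.1, rng=np.random.default_rng(0)):
        h_prob = sigmoid(v0 @ W + c)                  # positive phase
        h = (rng.random(h_prob.shape) < h_prob) * 1.0
        v_prob = sigmoid(h @ W.T + b)                 # reconstruction
        h_prob2 = sigmoid(v_prob @ W + c)             # negative phase
        W += lr * (v0.T @ h_prob - v_prob.T @ h_prob2) / len(v0)
        b += lr * (v0 - v_prob).mean(0)
        c += lr * (h_prob - h_prob2).mean(0)
        return W, b, c

    nv, nh = 6, 4
    rng = np.random.default_rng(1)
    W, b, c = 0.01 * rng.standard_normal((nv, nh)), np.zeros(nv), np.zeros(nh)
    batch = (rng.random((10, nv)) < 0.5).astype(float)
    W, b, c = cd1_step(W, b, c, batch)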

Changes:

New in toolbox

  • Using GPU in Backpropagation
  • Revision of some demo scripts
  • Function approximation with multiple outputs
  • Feature extraction with GRBM in first layer



JMLR GPstuff 4.7

by avehtari - June 9, 2016, 17:45:15 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 47799 views, 11845 downloads, 3 subscriptions

Rating: 5 / 5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2016-06-09 Version 4.7

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo

Improvements

  • Improved use of PSIS
  • More options added to gp_monotonic
  • Monotonicity now works for additive covariance functions with selected variables
  • Possibility to use the gpcf_squared.m covariance function with derivative observations/monotonicity
  • Default behaviour made more robust by changing default jitter from 1e-9 to 1e-6
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016). Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models. JMLR, accepted for publication)
  • The 'selected variables' option now works better with monotonicity

Bugfixes

  • small error in derivative observation computation fixed
  • several minor bug fixes

ELKI 0.7.1

by erich - March 14, 2016, 13:44:02 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 29157 views, 5078 downloads, 4 subscriptions

About: ELKI is a framework for implementing data-mining algorithms with support for index structures; it includes a wide variety of clustering and outlier detection methods.

Changes:

Additions and improvements from ELKI 0.7.0 to 0.7.1:

Algorithm additions:

  • GriDBSCAN: DBSCAN using grid partitioning (Minkowski distances only)

  • Compare-Means and Sort-Means k-means variations (much faster than traditional k-means)

  • Visualization of dendrograms.

Important bug fixes:

  • Classes with no package ("default package") would cause errors.

  • The fast power function implementation was sometimes returning incorrect results.

  • Random sampling was sometimes not sampling from the full data set.

UI improvements:

  • The file input source will now automatically choose the Arff parser for .arff files.

  • MiniGUI now allows choosing other applications.

  • MiniGUI now displays the command line in a separate field.

  • MiniGUI displays an error message, if an incorrect classpath or JAyatana (on Ubuntu) is detected.

  • Export to PNG now works; we added a work-around for an open Batik bug.

Smaller changes:

  • Many smaller bug fixes.

  • C-Index for cluster evaluation now can process larger data sets.

  • OPTICS output of undefined reachability fixed.

  • External distance matrices are easier to use and perform additional checks.

  • Precomputed distance matrices can answer range and kNN queries.

  • Voronoi visualization can be switched in the menu now.

  • Improved backwards command line compatibility with additional aliases.

  • Added generated @since annotations in JavaDoc.

  • Many new unit tests, renamed to the Java conventions.

  • Low-level reading of service files, to have faster startup.


MDLText 1

by renatoms88 - March 3, 2016, 19:31:25 CET [ BibTeX Download ] 1212 views, 503 downloads, 2 subscriptions

About: testing mloss.org

Changes:

Initial Announcement on mloss.org.


BayesPy 0.4.1

by jluttine - November 2, 2015, 13:40:09 CET [ Project Homepage BibTeX Download ] 21371 views, 4639 downloads, 3 subscriptions

About: Variational Bayesian inference tools for Python
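
A minimal usage sketch in the style of BayesPy's quickstart (inferring the mean and precision of Gaussian data; exact API details may vary between versions):

    import numpy as np
    from bayespy.nodes import GaussianARD, Gamma
    from bayespy.inference import VB

    data = np.random.normal(5.0, 2.0, size=100)
    mu = GaussianARD(0, 1e-6)                 # vague prior on the mean
    tau = Gamma(1e-6, 1e-6)                   # vague prior on the precision
    y = GaussianARD(mu, tau, plates=(100,))   # likelihood node
    y.observe(data)
    Q = VB(mu, tau, y)                        # variational Bayesian inference engine
    Q.update(repeat=20)                       # posterior approximations now stored in mu and tau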

Changes:
  • Define extra dependencies needed to build the documentation

Cognitive Foundry 3.4.2

by Baz - October 30, 2015, 06:53:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 35575 views, 5997 downloads, 4 subscriptions

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Upgraded MTJ to 1.0.3.
  • Common:
    • Added package for hash function computation including Eva, FNV-1a, MD5, Murmur2, Prime, SHA1, SHA2
    • Added callback-based forEach implementations to Vector and InfiniteVector, which can be faster for iterating through some vector types.
    • Optimized DenseVector by removing a layer of indirection.
    • Added method to compute set of percentiles in UnivariateStatisticsUtil and fixed issue with percentile interpolation.
    • Added utility class for enumerating combinations.
    • Adjusted ScalarMap implementation hierarchy.
    • Added method for copying a map to VectorFactory and moved createVectorCapacity up from SparseVectorFactory.
    • Added method for creating square identity matrix to MatrixFactory.
    • Added Random implementation that uses a cached set of values.
  • Learning:
    • Implemented feature hashing.
    • Added factory for random forests.
    • Implemented uniform distribution over integer values.
    • Added Chi-squared similarity.
    • Added KL divergence.
    • Added general conditional probability distribution.
    • Added interfaces for Regression, UnivariateRegression, and MultivariateRegression.
    • Fixed null pointer exception that can happen in K-means with an empty cluster.
    • Fixed name of maxClusters property on AgglomerativeClusterer (was called maxMinDistance).
  • Text:
    • Improvements to LDA Gibbs sampler.

KEEL Knowledge Extraction based on Evolutionary Learning 3.0

by keel - September 18, 2015, 12:38:54 CET [ Project Homepage BibTeX Download ] 2885 views, 707 downloads, 1 subscription

About: KEEL (Knowledge Extraction based on Evolutionary Learning) is an open source (GPLv3) Java software tool that can be used for a large number of different knowledge data discovery tasks. KEEL provides a simple GUI based on data flow to design experiments with different datasets and computational intelligence algorithms (paying special attention to evolutionary algorithms) in order to assess the behavior of the algorithms. It contains a wide variety of classical knowledge extraction algorithms, preprocessing techniques (training set selection, feature selection, discretization, imputation methods for missing values, among others), computational intelligence based learning algorithms, hybrid models, statistical methodologies for contrasting experiments, and so forth. It makes it possible to perform a complete analysis of new computational intelligence proposals in comparison to existing ones. Moreover, KEEL has been designed with a two-fold goal: research and education. KEEL is also coupled with KEEL-dataset, a webpage that aims at providing machine learning researchers with a set of benchmarks to analyze the behavior of the learning methods. Concretely, it is possible to find benchmarks already formatted in the KEEL format for classification (such as standard, multi-instance or imbalanced data), semi-supervised classification, regression, time series and unsupervised learning. Also, a set of low-quality data benchmarks is maintained in the repository.

Changes:

Initial Announcement on mloss.org.


Java Data Mining Package 0.3.0

by arndt - August 19, 2015, 15:44:46 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3436 views, 759 downloads, 3 subscriptions

About: A Java library for machine learning and data analytics

Changes:

Initial Announcement on mloss.org.

