All entries.
Showing items 61-80 of 676 (page 4 of 34).

r-cran-pamr 1.54

by r-cran-robot - April 1, 2013, 00:00:06 CET [ Project Homepage BibTeX Download ] 58611 views, 12553 downloads, 0 subscriptions

About: Pam

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:06.709586


JMLR SSA Toolbox 1.3

by paulbuenau - January 24, 2012, 15:51:02 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 47691 views, 12439 downloads, 0 subscriptions

About: The SSA Toolbox is an efficient, platform-independent, standalone implementation of the Stationary Subspace Analysis algorithm with a friendly graphical user interface and a bridge to Matlab. Stationary Subspace Analysis (SSA) is a general purpose algorithm for the explorative analysis of non-stationary data, i.e. data whose statistical properties change over time. SSA helps to detect, investigate and visualize temporal changes in complex high-dimensional data sets.
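
The algorithm itself is beyond a few lines, but the kind of epoch-wise statistics that SSA compares is easy to illustrate. The following NumPy sketch is only a toy non-stationarity diagnostic (per-dimension drift of epoch means and variances); it is not the SSA algorithm and not the toolbox's Java/Matlab API.

    import numpy as np

    def epoch_nonstationarity(X, n_epochs=10):
        """Toy score per input dimension: spread of epoch-wise means
        plus spread of epoch-wise variances."""
        epochs = np.array_split(X, n_epochs)               # split along time
        means = np.array([e.mean(axis=0) for e in epochs])
        varis = np.array([e.var(axis=0) for e in epochs])
        return means.var(axis=0) + varis.var(axis=0)

    # Example: dimension 0 drifts over time, dimension 1 is stationary
    t = np.linspace(0, 1, 2000)
    X = np.column_stack([np.random.randn(2000) + 3 * t,
                         np.random.randn(2000)])
    print(epoch_nonstationarity(X))   # first score comes out much larger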

Changes:
  • Various bugfixes.

MDP: Modular toolkit for Data Processing 3.3

by otizonaizit - October 4, 2012, 15:17:33 CET [ Project Homepage BibTeX Download ] 48211 views, 12061 downloads, 0 subscriptions

Rating: 4.5/5 (based on 3 votes)

About: MDP is a Python library of widely used data processing algorithms that can be combined according to a pipeline analogy to build more complex data processing software. The base of available algorithms includes signal processing methods (Principal Component Analysis, Independent Component Analysis, Slow Feature Analysis), manifold learning methods ([Hessian] Locally Linear Embedding), several classifiers, probabilistic methods (Factor Analysis, RBM), data pre-processing methods, and many others.
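
The pipeline analogy is easiest to see in code. A minimal sketch of composing an MDP flow (PCANode, SFANode and Flow are MDP's documented building blocks; the synthetic data and parameter values are only illustrative):

    import numpy as np
    import mdp

    x = np.random.random((500, 20))        # 500 samples, 20 dimensions

    # Chain two nodes into a flow: PCA for dimensionality reduction,
    # then Slow Feature Analysis on the reduced signals.
    flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),
                     mdp.nodes.SFANode()])

    flow.train(x)           # trains the nodes one after the other
    y = flow.execute(x)     # runs data through the whole pipeline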

Changes:

What's new in version 3.3?

  • support sklearn versions up to 0.12
  • cleanly support reload
  • fail gracefully if pp server does not start
  • several bug-fixes and improvements

PyMVPA: Multivariate Pattern Analysis in Python 2.0.0

by yarikoptic - December 22, 2011, 01:36:32 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 65883 views, 11886 downloads, 0 subscriptions

Rating: 4/5 (based on 2 votes)

About: Python module to ease pattern classification analyses of large datasets. It provides high-level abstraction of typical processing steps (e.g. data preparation, classification, feature selection, [...]
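
For orientation, a minimal cross-validation sketch in the spirit of the PyMVPA 2.x (mvpa2) API referenced in the changelog below. The exact imports, the dataset_wizard helper and the synthetic data are illustrative assumptions rather than a verbatim recipe:

    import numpy as np
    from mvpa2.suite import (dataset_wizard, LinearCSVMC,
                             CrossValidation, NFoldPartitioner)

    # Synthetic data: 80 samples x 50 features, 2 classes, 4 chunks (runs)
    samples = np.random.randn(80, 50)
    targets = np.repeat(['A', 'B'], 40)
    chunks = np.tile(np.arange(4), 20)

    ds = dataset_wizard(samples, targets=targets, chunks=chunks)

    # Leave-one-chunk-out cross-validation of a linear SVM
    cv = CrossValidation(LinearCSVMC(), NFoldPartitioner())
    errors = cv(ds)
    print(np.mean(errors))   # mean misclassification error across folds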

Changes:
  • 2.0.0 (Mon, Dec 19 2011)

This release aggregates all the changes that occurred between official releases in the 0.4 series and various snapshot releases (in the 0.5 and 0.6 series). For a better overview of the high-level changes, see the release notes for 0.5 and 0.6, as well as the summaries of release candidates below.

  • Fixes (23 BF commits)

    • significance level in the right tail was fixed to include the value tested -- otherwise it resulted in an optimistic bias (or an absurdly high significance in the improbable case of all estimates having the same value)
    • compatible with the upcoming IPython 0.12 and renamed sklearn (Fixes #57)
    • do not double-train slave classifiers while assessing sensitivities (Fixes #53)
  • Enhancements (30 ENH + 3 NF commits)

    • resolving voting ties in kNN based on mean distance, and randomly in SMLR
    • kNN's ca.estimates now contains dictionaries with votes for each class
    • consistent zscoring in Hyperalignment
  • 2.0.0~rc5 (Wed, Oct 19 2011)

  • Major: to allow easy co-existence with the stable PyMVPA 0.4.x series, the 0.6 development mvpa module was renamed to mvpa2.

  • Fixes

    • compatible with the new Shogun 1.x series
    • compatible with the new h5py 2.x series
    • mvpa-prep-fmri -- various compatibility fixes and smoke testing
    • deepcopying SummaryStatistics during add
  • Enhancements

    • tutorial uses mvpa2.tutorial_suite now
    • better suppression of R warnings when needed
    • internal attributes of many classes were exposed as properties
    • more unification of __repr__ for many classes
  • 0.6.0~rc4 (Wed, Jun 14 2011)

  • Fixes

    • Finished transition to nibabel conventions in plot_lightbox
    • Addressed matplotlib.hist API change
    • Various adjustments in the test batteries (nibabel 1.1.0 compatibility, etc.)
  • New functionality

    • Explicit new argument flatten to from_wizard -- the default behavior changed if a mapper was provided as well
  • Enhancements

    • Elaborated __str__ and __repr__ for some Classifiers and Measures
  • 0.6.0~rc3 (Thu, Apr 12 2011)

  • Fixes

    • Bugfixes regarding the interaction of FlattenMapper and BoxcarMapper that affected event-related analyses.
    • Splitter now handles attribute value None for splitting properly.
    • Fixed GNBSearchlight's handling of roi_ids.
    • More robust detection of scikits.learn and nipy externals.
  • New functionality

    • Added a Repeater node to yield a dataset multiple times and a Sifter node to exclude some datasets; consequently, the "nosplitting" mode of Splitter was removed.
    • tools/niils -- a little tool to list details (dimensionality, scaling, etc.) of files in nibabel-supported formats.
  • Enhancements

    • Numerous documentation fixes.
    • Various improvements and increased flexibility of null distribution estimation of Measures.
    • All attributes are now reported in sorted order when printing a dataset.
    • fmri_dataset now also stores the input image type.
    • CrossValidation can now take a custom Splitter instance. Moreover, its default splitter is more robust in terms of the number and type of created splits for common usage patterns (i.e. together with partitioners).
    • CrossValidation takes any custom Node as the errorfx argument.
    • ConfusionMatrix can now be used as an errorfx in CrossValidation.
    • LOE(ACC): Linear Order Effect in ACC was added to ConfusionMatrix to detect trends in performance across splits.
    • A Node's postproc is now accessible as a property.
    • RepeatedMeasure has a new 'concat_as' argument that allows results to be concatenated along the feature axis. The default behavior, stacking as multiple samples, is unchanged.
    • Searchlight now has the ability to mark the center/seed of an ROI with a feature attribute in the generated datasets.
    • debug takes args parameter for delayed string comprehensions. It should reduce run-time impact of debug() calls in regular, non -O mode of Python operation.
    • String summaries and representations (provided by __str__ and __repr__) were made more exhaustive and more coherent. Additional properties to access initial constructor arguments were added to a variety of classes.
  • Internal changes

    • New debug target STDOUT to allow attaching metrics (e.g. traceback, timestamps) to regular output printed to stdout

    • New set of decorators to help with unittests:

      • @nodebug to disable specific debug targets for the duration of the test.
      • @reseed_rng to guarantee consistent random data given initial seeding.
      • @with_tempfile to provide a tempfile name which would get removed upon completion (test success or failure).

    • Dropping daily testing of maint/0.5 branch -- RIP.

    • Collections were provided with adequate copy/deepcopy, and Dataset was refactored to use Collection's copy method.

    • update-* Makefile rules should now automatically fast-forward the corresponding website-updates branch.

    • MVPA_TESTS_VERBOSITY now also controls numpy warnings.

    • Dataset.__array__ provides the original array instead of a copy (unless a dtype is provided).

Also adapts changes from 0.4.6 and 0.4.7 (see corresponding changelogs).

  • 0.6.0~rc2 (Thu, Mar 3 2011)

  • Various fixes in the mvpa.atlas module.

  • 0.6.0~rc1 (Thu, Feb 24 2011)

  • Many, many, many

  • For an overview of the most drastic changes, see the constantly evolving release notes for 0.6.

  • 0.5.0 (sometime in March 2010)

This is a special release, because it never saw the general public. A summary of the fundamental changes introduced in this development version can be found in the release notes for 0.5.

Most notably, this version was the first to come with a comprehensive two-day workshop/tutorial.

  • 0.4.7 (Tue, Mar 07 2011) (Total: 12 commits)

A bugfix release

  • Fixed

    • Addressed an issue with input NIfTI files having scl_ fields set: it could result in incorrect analyses and incorrect map2nifti-produced NIfTI files. Input files now account for scaling/offset if the scl_ fields direct so, and upon map2nifti those fields get reset.
    • doc/examples/searchlight_minimal.py -- the best error is the minimal one
  • Enhancements

    • mvpa.clfs.gnb.GNB can now tolerate training datasets with a single label
    • mvpa.clfs.meta.TreeClassifier can have trailing nodes with no classifier assigned
  • 0.4.6 (Tue, Feb 01 2011) (Total: 20 commits)

A bugfix release

  • Fixed (few BF commits):

    • Compatibility with numpy 1.5.1 (histogram) and scipy 0.8.0 (workaround for a regression in legendre)
    • Compatibility with libsvm 3.0
    • mvpa.clfs.plr.PLR robustification
  • Enhancements

    • Enforce suppression of numpy warnings while running unittests. Also setting verbosity >= 3 enables all warnings (Python, NumPy, and PyMVPA)
    • doc/examples/nested_cv.py example (adopted from 0.5)
    • Introduced base class mvpa.clfs.base.LearnerError for classifiers' exceptions (adopted from 0.5)
    • Adjusted example data to live up to nibabel's warranty of NIfTI standard-compliance
    • More robust operation of MC iterations -- skip iterations where the classifier experienced difficulties and raised an exception (e.g. due to degenerate data)

r-cran-CoxBoost 1.4

by r-cran-robot - January 1, 2018, 00:00:07 CET [ Project Homepage BibTeX Download ] 55761 views, 11719 downloads, 0 subscriptions

About: Cox models by likelihood based boosting for a single survival endpoint or competing risks

Changes:

Fetched by r-cran-robot on 2018-01-01 00:00:07.280476


XGBoost v0.4.0

by crowwork - May 12, 2015, 08:57:16 CET [ Project Homepage BibTeX Download ] 38898 views, 11557 downloads, 0 subscriptions

About: xgboost: eXtreme Gradient Boosting. An efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically do parallel computation with OpenMP and can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification and ranking. The package is made to be extensible, so that users can also easily define their own objectives. The newest version of xgboost now supports distributed learning on platforms such as Hadoop and MPI, and scales to even larger problems.
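
A small sketch of the Python interface for such a workflow (DMatrix, train and predict follow xgboost's documented API; the parameter values, the synthetic data and the early-stopping setup are only illustrative):

    import numpy as np
    import xgboost as xgb

    # Toy binary classification data
    X = np.random.randn(1000, 10)
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

    dtrain = xgb.DMatrix(X[:800], label=y[:800])
    dtest = xgb.DMatrix(X[800:], label=y[800:])

    params = {'objective': 'binary:logistic', 'max_depth': 3, 'eta': 0.1}

    # Early stopping on a held-out set, one of the features listed below
    bst = xgb.train(params, dtrain, num_boost_round=200,
                    evals=[(dtest, 'test')], early_stopping_rounds=10)

    preds = bst.predict(dtest)   # predicted probabilities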

Changes:
  • Distributed version of xgboost that runs on YARN, scales to billions of examples

  • Direct save/load data and model from/to S3 and HDFS

  • Feature importance visualization in R module, by Michael Benesty

  • Predict leaf index

  • Poisson regression for counts data

  • Early stopping option in training

  • Native save load support in R and python

  • xgboost models now can be saved using save/load in R

  • xgboost python model is now picklable

  • sklearn wrapper is supported in python module

  • Experimental External memory version


Elefant 0.4

by kishorg - October 17, 2009, 08:48:19 CET [ Project Homepage BibTeX Download ] 34443 views, 11283 downloads, 0 subscriptions

Rating: 2.5/5 (based on 2 votes)

About: Elefant is an open source software platform for the Machine Learning community licensed under the Mozilla Public License (MPL) and developed using Python, C, and C++. We aim to make it the platform [...]

Changes:

This release contains the Stream module as a first step in the direction of providing C++ library support. Stream aims to be a software framework for the implementation of large scale online learning algorithms. Large scale, in this context, should be understood as something that does not fit in the memory of a standard desktop computer.

Added Bundle Methods for Regularized Risk Minimization (BMRM), allowing the user to choose from a list of loss functions and solvers (linear and quadratic).

Added the following loss classes: BinaryClassificationLoss, HingeLoss, SquaredHingeLoss, ExponentialLoss, LogisticLoss, NoveltyLoss, LeastMeanSquareLoss, LeastAbsoluteDeviationLoss, QuantileRegressionLoss, EpsilonInsensitiveLoss, HuberRobustLoss, PoissonRegressionLoss, MultiClassLoss, WinnerTakesAllMultiClassLoss, ScaledSoftMarginMultiClassLoss, SoftmaxMultiClassLoss, MultivariateRegressionLoss
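
To make the objective behind BMRM concrete (this is not Elefant's API and not a bundle method): the regularized risk for the hinge loss from the list above, with an L2 regularizer, minimized here by plain subgradient descent in NumPy.

    import numpy as np

    def regularized_hinge_risk(w, X, y, lam):
        """J(w) = lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i <w, x_i>)"""
        margins = 1.0 - y * (X @ w)
        return 0.5 * lam * (w @ w) + np.maximum(0.0, margins).mean()

    def subgradient_descent(X, y, lam=0.1, steps=500, eta=1.0):
        """Minimize the regularized hinge risk (BMRM attacks the same
        objective with a cutting-plane/bundle approach instead)."""
        n, d = X.shape
        w = np.zeros(d)
        for t in range(1, steps + 1):
            margins = 1.0 - y * (X @ w)
            active = margins > 0                                 # margin violators
            grad = lam * w - (y[active, None] * X[active]).sum(axis=0) / n
            w -= (eta / t) * grad                                # decaying step size
        return w

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))
    y = np.where(X[:, 0] > 0, 1.0, -1.0)                         # labels in {-1, +1}
    w = subgradient_descent(X, y)
    print(regularized_hinge_risk(w, X, y, lam=0.1))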

The graphical user interface now provides extensive documentation for each component, explaining state variables and port descriptions.

Changed saving and loading of experiments to XML (thereby avoiding storage of large input data structures).

Unified automatic input checking via new static typing extending Python properties.

Full support for recursive composition of larger components containing arbitrary statically typed state variables.


r-cran-rgenoud 5.7-8.1

by r-cran-robot - June 3, 2012, 00:00:00 CET [ Project Homepage BibTeX Download ] 46127 views, 11160 downloads, 0 subscriptions

About: R version of GENetic Optimization Using Derivatives

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:08.101900


Apache Mahout 0.11.1

by gsingers - November 9, 2015, 16:12:06 CET [ Project Homepage BibTeX Download ] 46536 views, 11002 downloads, 0 subscriptions

About: Apache Mahout is an Apache Software Foundation project with the goal of creating both a community of users and a scalable, Java-based framework consisting of many machine learning algorithm [...]

Changes:

Apache Mahout introduces a new math environment we call Samsara, for its theme of universal renewal. It reflects a fundamental rethinking of how scalable machine learning algorithms are built and customized. Mahout-Samsara is here to help people create their own math while providing some off-the-shelf algorithm implementations. At its core are general linear algebra and statistical operations along with the data structures to support them. You can use it as a library or customize it in Scala with Mahout-specific extensions that look something like R. Mahout-Samsara comes with an interactive shell that runs distributed operations on a Spark cluster. This makes prototyping or task submission much easier and allows users to customize algorithms with a whole new degree of freedom. Mahout Algorithms include many new implementations built for speed on Mahout-Samsara. They run on Spark 1.3+ and some on H2O, which means as much as a 10x speed increase. You’ll find robust matrix decomposition algorithms as well as a Naive Bayes classifier and collaborative filtering. The new spark-itemsimilarity enables the next generation of cooccurrence recommenders that can use entire user click streams and context in making recommendations.


MLweb 1.2

by lauerfab - February 23, 2018, 15:40:27 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 47660 views, 10991 downloads, 0 subscriptions

About: MLweb is an open source project that aims at bringing machine learning capabilities into web pages and web applications, while maintaining all computations on the client side. It includes (i) a JavaScript library to enable scientific computing within web pages, (ii) a JavaScript library implementing machine learning algorithms for classification, regression, clustering and dimensionality reduction, (iii) a web application providing a MATLAB-like development environment.

Changes:
  • Add BibTeX entry of corresponding Neurocomputing paper
  • Create JavaScript modules to avoid global scope pollution in web pages

MIToolbox 3.0.1

by apocock - March 2, 2017, 00:38:52 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 64700 views, 10715 downloads, 0 subscriptions

About: A mutual information library for C, with Mex bindings for MATLAB. It is aimed at feature selection and provides simple methods to calculate mutual information, conditional mutual information, entropy, conditional entropy, Renyi entropy/mutual information, and weighted variants of Shannon entropies/mutual informations. It works with discrete distributions and expects column vectors of features.
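
For reference, the central quantity is easy to state outside the library (this is plain NumPy over discrete column vectors, not MIToolbox's C/Mex API):

    import numpy as np

    def mutual_information(x, y):
        """I(X;Y) in bits for two discrete 1-D arrays of equal length."""
        mi = 0.0
        for xv in np.unique(x):
            for yv in np.unique(y):
                p_xy = np.mean((x == xv) & (y == yv))   # joint probability
                p_x = np.mean(x == xv)                  # marginal of X
                p_y = np.mean(y == yv)                  # marginal of Y
                if p_xy > 0:
                    mi += p_xy * np.log2(p_xy / (p_x * p_y))
        return mi

    x = np.array([0, 0, 1, 1, 2, 2])
    y = np.array([0, 0, 1, 1, 1, 1])         # y merges two of x's values
    print(mutual_information(x, y))          # about 0.918 bits
    print(mutual_information(x, x))          # equals H(X) = log2(3), about 1.585 bits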

Changes:

Fixed a Windows compilation bug. MIToolbox v3 should now compile using Visual Studio.


JMLR CAM Java 3.1

by wangny - October 14, 2013, 22:46:03 CET [ Project Homepage BibTeX Download ] 33879 views, 10675 downloads, 0 subscriptions

About: The CAM R-Java software provides a novel way to solve the blind source separation problem.

Changes:

In this version, we fix the problem of not working under the newest R version, R-3.0.


r-cran-klaR 0.6-8

by r-cran-robot - March 27, 2013, 00:00:00 CET [ Project Homepage BibTeX Download ] 45048 views, 10385 downloads, 0 subscriptions

About: Classification and visualization

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:05.722314


JMLR MOA: Massive Online Analysis Nov-13

by abifet - April 4, 2014, 03:50:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 36262 views, 10138 downloads, 0 subscriptions

About: Massive Online Analysis (MOA) is a real time analytic tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license.

Changes:

New version November 2013


r-cran-bst 0.3-15

by r-cran-robot - July 22, 2018, 00:00:00 CET [ Project Homepage BibTeX Download ] 38356 views, 9720 downloads, 0 subscriptions

About: Gradient Boosting

Changes:

Fetched by r-cran-robot on 2018-09-01 00:00:05.199020


r-cran-randomForest 4.6-7

by r-cran-robot - October 16, 2012, 00:00:00 CET [ Project Homepage BibTeX Download ] 40994 views, 9714 downloads, 0 subscriptions

About: Breiman and Cutler's random forests for classification and regression

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:07.638240


JMLR DLLearner 1.0

by Jens - February 13, 2015, 11:39:46 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 46473 views, 9630 downloads, 0 subscriptions

Rating: 4.5/5 (based on 3 votes)

About: The DL-Learner framework contains several algorithms for supervised concept learning in Description Logics (DLs) and OWL.

Changes:

See http://dl-learner.org/development/changelog/.


JMLR Shark 2.3.0

by igel - October 24, 2009, 22:12:48 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 50119 views, 9491 downloads, 0 subscriptions

Rating: 4/5 (based on 3 votes)

About: SHARK is a modular C++ library for the design and optimization of adaptive systems. It provides various machine learning and computational intelligence techniques.

Changes:
  • moved to GitHub
  • new build system
  • minor bug fixes

RLLib: Lightweight On/Off Policy Reinforcement Learning Library 2.0

by saminda - April 25, 2014, 02:58:32 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 41789 views, 9380 downloads, 0 subscriptions

About: RLLib is a lightweight C++ template library that implements incremental, standard, and gradient temporal-difference learning algorithms in Reinforcement Learning. It is an optimized library for robotic applications and embedded devices that operates under fast duty cycles (e.g., < 30 ms). RLLib has been tested and evaluated on RoboCup 3D soccer simulation agents, physical NAO V4 humanoid robots, and Tiva C series launchpad microcontrollers to predict, control, learn behaviors, and represent learnable knowledge. The implementation of the RLLib library is inspired by the RLPark API, which is a library of temporal-difference learning algorithms written in Java.
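
RLLib itself is a C++ template library, but the incremental temporal-difference idea it implements is compact. A plain-Python sketch of tabular TD(0) prediction (purely illustrative, not RLLib's API or its gradient-TD variants):

    def td0_value_estimate(episodes, alpha=0.1, gamma=0.9):
        """Tabular TD(0): update V(s) incrementally from (s, r, s') transitions.
        A transition with s_next set to None is terminal."""
        V = {}
        for episode in episodes:
            for s, r, s_next in episode:
                v_next = 0.0 if s_next is None else V.get(s_next, 0.0)
                td_error = r + gamma * v_next - V.get(s, 0.0)
                V[s] = V.get(s, 0.0) + alpha * td_error
        return V

    # Tiny chain: A -> B -> terminal, reward 1 on the final step
    episodes = [[('A', 0.0, 'B'), ('B', 1.0, None)] for _ in range(200)]
    print(td0_value_estimate(episodes))   # V(B) near 1.0, V(A) near gamma * V(B)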

Changes:

Current release version is v2.0.


libcmaes 0.9.5

by beniz - March 9, 2015, 09:05:22 CET [ Project Homepage BibTeX Download ] 40190 views, 9322 downloads, 0 subscriptions

About: Libcmaes is a multithreaded C++11 library (with Python bindings) for high-performance blackbox stochastic optimization of difficult, possibly non-linear and non-convex functions, using CMA-ES, the Covariance Matrix Adaptation Evolution Strategy. Libcmaes is useful for minimizing or maximizing any function without any information about its gradient or differentiability.
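
As rough orientation only, the sample-evaluate-update loop that CMA-ES refines (with covariance and step-size adaptation) looks like the following toy (mu/mu, lambda) evolution strategy with a fixed step size. This is plain NumPy, deliberately simplified, and not libcmaes's C++ or Python API.

    import numpy as np

    def toy_evolution_strategy(f, x0, sigma=0.3, lam=20, iters=200):
        """(mu/mu, lambda)-ES with fixed step size: sample around the mean,
        keep the best half, move the mean to their average."""
        rng = np.random.default_rng(0)
        mean = np.asarray(x0, dtype=float)
        mu = lam // 2
        for _ in range(iters):
            candidates = mean + sigma * rng.standard_normal((lam, mean.size))
            fitness = np.array([f(c) for c in candidates])
            best = candidates[np.argsort(fitness)[:mu]]   # keep the best half (minimization)
            mean = best.mean(axis=0)
        return mean

    # Minimize a shifted sphere function
    sphere = lambda x: np.sum((x - 1.5) ** 2)
    print(toy_evolution_strategy(sphere, x0=np.zeros(5)))   # close to [1.5, ..., 1.5]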

Changes:

This is a major release, with several novelties, improvements and fixes, among which:

  • step-size two-point adaptation scheme for improved performance in some settings, ref #88

  • important bug fixes to the ACM surrogate scheme, ref #57, #106

  • simple high-level workflow under Python, ref #116

  • improved performances in high dimensions, ref #97

  • improved profile likelihood and contour computations, including under geno/pheno transforms, ref #30, #31, #48

  • elitist mechanism for forcing best solutions during evolution, ref #103

  • new legacy plotting function, ref #110

  • optional initial function value, ref #100

  • improved C++ API, ref #89

  • Python bindings support with Anaconda, ref #111

  • configure script now tries to detect numpy when building Python bindings, ref #113

  • Python bindings now have embedded documentation, ref #114

  • support for Travis continuous integration, ref #122

  • lower resolution random seed initialization

