All entries.

Elefant 0.4

by kishorg - October 17, 2009, 08:48:19 CET [ Project Homepage BibTeX Download ] 23874 views, 8843 downloads, 2 subscriptions

Rating: 2.5/5 (based on 2 votes)

About: Elefant is an open source software platform for the machine learning community, licensed under the Mozilla Public License (MPL) and developed in Python, C, and C++. We aim to make it the platform [...]

Changes:

This release contains the Stream module as a first step towards providing C++ library support. Stream aims to be a software framework for implementing large-scale online learning algorithms. Large scale, in this context, means data that does not fit in the memory of a standard desktop computer.
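As a rough illustration of what "online learning over data that does not fit in memory" means in practice, here is a minimal Python sketch; it is illustrative only and is not the Stream C++ API (the file format, learning rate, and squared loss are assumptions).

    # Illustrative only -- not Elefant's Stream API. A linear model trained by
    # stochastic gradient descent, reading one example per line so the full
    # dataset never has to be held in memory.
    import numpy as np

    def stream_examples(path):
        # yield (features, target) pairs from a hypothetical CSV file, one line at a time
        with open(path) as f:
            for line in f:
                values = [float(v) for v in line.split(",")]
                yield np.array(values[:-1]), values[-1]

    def online_sgd(examples, n_features, lr=0.01):
        w = np.zeros(n_features)
        for x, y in examples:
            grad = (w.dot(x) - y) * x      # gradient of the squared loss on one example
            w -= lr * grad
        return w

    # w = online_sgd(stream_examples("data.csv"), n_features=10)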

Added Bundle Methods for Regularized Risk Minimization (BMRM), allowing the user to choose from a list of loss functions and solvers (linear and quadratic).

Added the following loss classes: BinaryClassificationLoss, HingeLoss, SquaredHingeLoss, ExponentialLoss, LogisticLoss, NoveltyLoss, LeastMeanSquareLoss, LeastAbsoluteDeviationLoss, QuantileRegressionLoss, EpsilonInsensitiveLoss, HuberRobustLoss, PoissonRegressionLoss, MultiClassLoss, WinnerTakesAllMultiClassLoss, ScaledSoftMarginMultiClassLoss, SoftmaxMultiClassLoss, MultivariateRegressionLoss

The graphical user interface now provides extensive documentation for each component, explaining its state variables and port descriptions.

Changed saving and loading of experiments to XML (thereby avoiding storage of large input data structures).

Unified automatic input checking via a new static typing mechanism that extends Python properties (a generic sketch of the idiom appears below).

Full support for recursive composition of larger components containing arbitrary statically typed state variables.
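
The statically typed state variables mentioned above build on Python properties; the snippet below is a generic sketch of that idiom, with hypothetical names (typed_property, Component) rather than Elefant's actual classes.

    # Generic sketch of input checking via typed properties (hypothetical names,
    # not Elefant's API): assigning a value of the wrong type fails immediately
    # with a TypeError instead of failing later inside a component.
    def typed_property(name, expected_type):
        attr = "_" + name

        def getter(self):
            return getattr(self, attr)

        def setter(self, value):
            if not isinstance(value, expected_type):
                raise TypeError("%s must be %s, got %s"
                                % (name, expected_type.__name__, type(value).__name__))
            setattr(self, attr, value)

        return property(getter, setter)

    class Component(object):
        learning_rate = typed_property("learning_rate", float)

    c = Component()
    c.learning_rate = 0.1       # accepted
    # c.learning_rate = "fast"  # would raise TypeError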


revrand 1.0.0

by dsteinberg - January 29, 2017, 04:33:54 CET [ Project Homepage BibTeX Download ] 11306 views, 2294 downloads, 3 subscriptions


About: A library of scalable Bayesian generalised linear models with fancy features

Changes:
  • 1.0 release!
  • Now there is a random search phase before optimization of all hyperparameters in the regression algorithms. This improves the performance of revrand, since local optima are more easily avoided with this better initialisation (a minimal sketch of the idea follows this list).
  • Regression regularizers (weight variances) are now associated with each basis object; this approximates GP kernel addition more closely.
  • Random state can be set for all random objects
  • Numerous small improvements to make revrand production ready
  • Final report
  • Documentation improvements
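
The random-search-then-optimize step mentioned in the list above is a general technique; here is a minimal sketch of the idea, with a made-up objective and bounds that do not reflect revrand's internals.

    # Sketch of random search followed by local optimization (illustrative only):
    # sample random hyperparameter vectors, keep the best one, then refine it with
    # a gradient-based optimizer, which makes bad local optima easier to avoid.
    import numpy as np
    from scipy.optimize import minimize

    def objective(theta):
        # placeholder objective, e.g. a negative log marginal likelihood
        return np.sum((theta - np.array([1.0, -2.0])) ** 2) + np.sum(np.sin(5 * theta))

    rng = np.random.RandomState(0)
    candidates = rng.uniform(-5.0, 5.0, size=(50, 2))            # random search phase
    best_start = min(candidates, key=objective)                  # best random candidate
    result = minimize(objective, best_start, method="L-BFGS-B")  # local refinement
    print(result.x)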

rectools: A Novel Toolbox for Recommender Systems 1.0.0

by matloff - October 29, 2016, 07:41:58 CET [ Project Homepage BibTeX Download ] 1426 views, 269 downloads, 2 subscriptions


About: Novel R toolbox for collaborative filtering recommender systems.

Changes:

Initial Announcement on mloss.org.


Local High-Order Regularization 1.0

by kkim - March 2, 2016, 13:46:17 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1652 views, 410 downloads, 2 subscriptions


About: Local high-order regularization for semi-supervised learning

Changes:

Initial Announcement on mloss.org.


LOMO Feature Extraction and XQDA Metric Learning for Person Re-identification 1.0

by openpr_nlpr - May 6, 2015, 11:38:32 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3137 views, 473 downloads, 3 subscriptions


About: This MATLAB package provides the LOMO feature extraction and the XQDA metric learning algorithms proposed in our CVPR 2015 paper. It is fast and effective for person re-identification. For more details, please visit http://www.cbsr.ia.ac.cn/users/scliao/projects/lomo_xqda/.

Changes:

Initial Announcement on mloss.org.


DynaML 1.4.1

by mandar2812 - April 20, 2017, 18:32:33 CET [ Project Homepage BibTeX Download ] 176 views, 22 downloads, 1 subscription

About: DynaML is a Scala environment for conducting research and education in machine learning. It comes packaged with a powerful library of classes implementing predictive models and a Scala REPL where one can not only build custom models but also experiment with data workflows.

Changes:

Initial Announcement on mloss.org.


pycobra: Regression Analysis and Ensemble Toolkit 0.1.0

by bhargavvader - April 19, 2017, 15:04:14 CET [ Project Homepage BibTeX Download ] 182 views, 22 downloads, 2 subscriptions

About: pycobra is a Python toolkit to help with regression analysis and visualisation. It provides an implementation of the COBRA predictor-aggregation algorithm.
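
As a rough sketch of the COBRA aggregation idea (not pycobra's actual API): the prediction at a query point is the average target over training points on which every basic machine predicts within a tolerance eps of its prediction at the query. The unanimity rule and the value of eps below are simplifying assumptions.

    # Rough sketch of COBRA-style aggregation; `machines` are any fitted
    # regressors with a scikit-learn-like .predict() method.
    import numpy as np

    def cobra_predict(machines, X_train, y_train, x_query, eps=0.5):
        # machine predictions on the retained training points and on the query
        train_preds = np.column_stack([m.predict(X_train) for m in machines])
        query_preds = np.array([m.predict(x_query.reshape(1, -1))[0] for m in machines])
        # keep the points on which all machines agree with the query within eps
        close = np.all(np.abs(train_preds - query_preds) <= eps, axis=1)
        if not close.any():
            return float(np.mean(y_train))        # fall back to the global mean
        return float(np.mean(y_train[close]))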

Changes:

Initial Announcement on mloss.org.


Theano 0.9.0

by jaberg - April 10, 2017, 20:30:17 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 31236 views, 5264 downloads, 3 subscriptions

About: A Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. Dynamically generates CPU and GPU modules for good performance. Deep Learning Tutorials illustrate deep learning with Theano.
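
For readers new to the library, a minimal example of the define/compile/evaluate workflow (standard Theano usage):

    # Define a symbolic expression and its gradient, then compile them into a
    # callable function; Theano generates optimized CPU (or GPU) code behind it.
    import theano
    import theano.tensor as T

    x = T.dvector('x')                 # symbolic double-precision vector
    y = T.sum(x ** 2)                  # symbolic expression
    g = T.grad(y, x)                   # symbolic gradient of y with respect to x
    f = theano.function([x], [y, g])   # compile
    print(f([1.0, 2.0, 3.0]))          # -> [array(14.0), array([2., 4., 6.])]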

Changes:

Theano 0.9.0 (20th of March, 2017)

Highlights (since 0.8.0):

* Better Python 3.5 support
* Better numpy 1.12 support
* Conda packages for Mac, Linux and Windows
* Support newer Mac and Windows versions
* More Windows integration:

    * Theano scripts (``theano-cache`` and ``theano-nose``) now work on Windows
    * Better support for Windows line endings in C code
    * Support for spaces in paths on Windows

* Scan improvements:

    * More scan optimizations, with faster compilation and gradient computation
    * Support for checkpoints in scan (a trade-off between speed and memory usage, useful for long sequences)
    * Fixed broadcast checking in scan

* Graphs improvements:

    * More numerical stability by default for some graphs
    * Better handling of corner cases for theano functions and graph optimizations
    * More graph optimizations with faster compilation and execution
    * Smaller and more readable graphs

* New GPU back-end:

    * Removed warp-synchronous programming to get good results with newer CUDA drivers
    * More pooling support on GPU when cuDNN isn't available
    * Full support of ignore_border option for pooling
    * Inplace storage for shared variables
    * float16 storage
    * Use the PCI bus ID of graphics cards for a better mapping between Theano device numbers and nvidia-smi numbers
    * Fixed offset error in ``GpuIncSubtensor``

* Less C code compilation
* Added support for bool dtype
* Updated and more complete documentation
* Bug fixes related to merge optimizer and shape inference
* Lots of other bug fixes, crash fixes, and warning improvements

Calibrated AdaMEC 1.0

by nnikolaou - April 8, 2017, 13:57:45 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 492 views, 58 downloads, 2 subscriptions

About: Code for Calibrated AdaMEC for binary cost-sensitive classification. The method is simply AdaBoost with properly calibrated probability estimates and a cost-sensitive (i.e. risk-minimizing) decision threshold for classifying new data.
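
A rough scikit-learn sketch of the same recipe (calibrated AdaBoost plus a risk-minimizing decision threshold); the costs, calibration method and hyperparameters below are illustrative assumptions, and this is not the released Calibrated AdaMEC code.

    # Calibrate AdaBoost's probability estimates, then predict the positive class
    # whenever doing so has lower expected cost, i.e. when p(y=1|x) > c_FP / (c_FP + c_FN).
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.calibration import CalibratedClassifierCV

    X, y = make_classification(n_samples=500, random_state=0)   # toy binary data
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    c_FP, c_FN = 1.0, 5.0                        # illustrative misclassification costs
    threshold = c_FP / (c_FP + c_FN)             # risk-minimizing decision threshold

    clf = CalibratedClassifierCV(AdaBoostClassifier(n_estimators=100),
                                 method='sigmoid', cv=3)
    clf.fit(X_train, y_train)
    y_pred = (clf.predict_proba(X_test)[:, 1] > threshold).astype(int)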

Changes:

Initial Announcement on mloss.org.


KeLP 2.2.0

by kelpadmin - April 7, 2017, 16:51:42 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 14150 views, 3134 downloads, 3 subscriptions

About: The Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as the agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions from learning algorithms through the definition of specific interfaces. Once a new kernel function has been implemented, it can be automatically adopted in all the available kernel-machine algorithms. KeLP includes different online and batch learning algorithms for classification, regression and clustering, as well as several kernel functions, ranging from vector-based to structural kernels. It allows complex kernel-machine-based systems to be built, leveraging JSON/XML interfaces to instantiate prediction models without writing a single line of code.

Changes:

In addition to minor bug fixes, this release includes:

  • A new learning algorithm that enables KeLP (for the first time) to deal with sequence labeling problems! It is based on a Markovian formulation within an SVM framework. Most notably, this new meta-algorithm for sequence learning can work both with linear algorithms and with kernel-based algorithms!

  • A new cache (SimpleDynamicKernelCache) has been added to avoid the need to specify the number of expected items in the dataset. It is not specialized for any learning algorithm, so it is not the most efficient cache, but it is very easy to use.

Furthermore, we have also released a brand new website, www.kelp-ml.org, where you can find several tutorials and documentation about KeLP!

Check out this new version from our repositories. The API Javadoc is already available. Your suggestions will be very valuable to us, so download and try KeLP 2.2.0!

