Projects tagged with adaboost.


Cognitive Foundry 3.4.0

by Baz - April 3, 2015, 08:28:14 CET. 18668 views, 3036 downloads, 2 subscriptions

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Now requires Java 1.7 or higher.
    • Improved compatibility with Java 1.8 functions by removing the CloneableSerializable requirement from many function-style interfaces.
  • Common Core:
    • Improved iteration speed over sparse MTJ vectors.
    • Added utility methods for more stable log(1+x), exp(x)-1, log(1 - exp(x)), and log(1 + exp(x)) to LogMath (see the sketch after this list).
    • Added a method for creating partial permutations to Permutation.
    • Added methods for computing standard deviation to UnivariateStatisticsUtil.
    • Added increment, decrement, and list view methods to Vector and Matrix.
    • Added shorter get and set aliases for the Vector and Matrix getElement and setElement methods.
    • Added aliases of dot for dotProduct in VectorSpace.
    • Added utility methods for divideByNorm2 to VectorUtil.
  • Learning:
    • Added a learner for a Factorization Machine using SGD (a prediction sketch follows this list).
    • Added an iterative reporter for validation set performance.
    • Added new methods to statistical distribution classes to allow for faster sampling without boxing, in batches, or without allocating extra memory.
    • Made generics for performance evaluators more permissive.
    • ParameterGradientEvaluator changed to not require input, output, and gradient types to be the same. This allows more sane gradient definitions for scalar functions.
    • Added parameter to enforce a minimum size in a leaf node for decision tree learning. It is configured through the splitting function.
    • Added ability to filter which dimensions to use in the random subspace and variance tree node splitter.
    • Added ReLU, leaky ReLU, and soft plus activation functions for neural networks (definitions sketched after this list).
    • Added IntegerDistribution interface for distributions over natural numbers.
    • Added a method to get the mean of a numeric distribution without boxing.
    • Fixed an issue in DefaultDataDistribution that caused the total to be off when a value was set to less than or equal to 0.
    • Added property for rate to GammaDistribution.
    • Added method to get standard deviation from a UnivariateGaussian.
    • Added clone operations for decision tree classes.
    • Fixed an issue in the TukeyKramerConfidence interval computation.
    • Fixed serialization issue with SMO output.
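
For context on the stable LogMath helpers noted above, the standard formulations look roughly like the following. This is a minimal plain-Java sketch under assumed class and method names; it is not the Foundry's actual LogMath code.

    // Sketch of numerically stable log/exp helpers (illustrative names;
    // not the Foundry's actual LogMath implementation).
    public final class StableMathSketch
    {
        // log(1 + x) without cancellation for small |x|.
        public static double log1Plus(final double x)
        {
            return Math.log1p(x);
        }

        // exp(x) - 1 without cancellation for small |x|.
        public static double expMinus1(final double x)
        {
            return Math.expm1(x);
        }

        // log(1 - exp(x)) for x < 0, switching formulas to stay stable.
        public static double log1MinusExp(final double x)
        {
            // Near x = 0, exp(x) is close to 1, so expm1 keeps precision;
            // for very negative x, exp(x) is tiny, so log1p is safe.
            return x > -Math.log(2.0)
                ? Math.log(-Math.expm1(x))
                : Math.log1p(-Math.exp(x));
        }

        // log(1 + exp(x)), avoiding overflow for large positive x.
        public static double log1PlusExp(final double x)
        {
            return x > 0.0
                ? x + Math.log1p(Math.exp(-x))
                : Math.log1p(Math.exp(x));
        }
    }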
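
The factorization machine learner added above fits a second-order model whose prediction is usually computed with the O(kn) pairwise-interaction trick: y(x) = w0 + sum_i w_i x_i + 0.5 * sum_f [(sum_i v_if x_i)^2 - sum_i (v_if x_i)^2]. A sketch of just the prediction step, with illustrative field names rather than the Foundry's actual API:

    // Sketch of factorization machine prediction (illustrative only).
    public final class FactorizationMachineSketch
    {
        private final double bias;        // w0
        private final double[] weights;   // w_i, one per input dimension
        private final double[][] factors; // v[i][f], dimensions x factors

        public FactorizationMachineSketch(
            final double bias, final double[] weights, final double[][] factors)
        {
            this.bias = bias;
            this.weights = weights;
            this.factors = factors;
        }

        public double predict(final double[] x)
        {
            double result = this.bias;
            for (int i = 0; i < x.length; i++)
            {
                result += this.weights[i] * x[i];
            }
            final int factorCount = this.factors[0].length;
            for (int f = 0; f < factorCount; f++)
            {
                double sum = 0.0;
                double sumOfSquares = 0.0;
                for (int i = 0; i < x.length; i++)
                {
                    final double term = this.factors[i][f] * x[i];
                    sum += term;
                    sumOfSquares += term * term;
                }
                // Pairwise interactions in O(n) per factor instead of O(n^2).
                result += 0.5 * (sum * sum - sumOfSquares);
            }
            return result;
        }
    }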
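
The new activation functions have standard definitions; a brief sketch (class and method names are assumptions, not the Foundry's):

    // Standard definitions of the activations named above (illustrative).
    public final class ActivationSketch
    {
        // ReLU: max(0, x).
        public static double relu(final double x)
        {
            return Math.max(0.0, x);
        }

        // Leaky ReLU: x for positive inputs, a small negative slope otherwise.
        public static double leakyRelu(final double x, final double leakage)
        {
            return x > 0.0 ? x : leakage * x;
        }

        // Soft plus: log(1 + exp(x)), a smooth approximation of ReLU.
        public static double softPlus(final double x)
        {
            // Rewritten for large x to avoid overflow in exp.
            return x > 0.0
                ? x + Math.log1p(Math.exp(-x))
                : Math.log1p(Math.exp(x));
        }
    }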

JMLR MultiBoost 1.2.02

by busarobi - March 31, 2014, 16:13:04 CET. 27323 views, 4747 downloads, 1 subscription

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.
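
For readers unfamiliar with AdaBoost.MH: it maintains one weight per (example, label) pair and, after each round, multiplies each weight by exp(-alpha * y * h), where y is the true sign of the label, h is the weak learner's vote, and alpha is the learner's coefficient. A minimal single-round sketch (MultiBoost itself is C++; this Java version and all its names are purely illustrative):

    // One reweighting round of AdaBoost.MH over (example, label) weights.
    // Illustrative sketch; not MultiBoost's actual implementation.
    public final class AdaBoostMHRoundSketch
    {
        // weights[i][l]: weight of example i for label l.
        // labels[i][l] and votes[i][l] are in {-1, +1}.
        public static void reweight(
            final double[][] weights,
            final int[][] labels,
            final int[][] votes,
            final double alpha)
        {
            double normalizer = 0.0;
            for (int i = 0; i < weights.length; i++)
            {
                for (int l = 0; l < weights[i].length; l++)
                {
                    // Shrink weights on correct votes, grow them on mistakes.
                    weights[i][l] *= Math.exp(-alpha * labels[i][l] * votes[i][l]);
                    normalizer += weights[i][l];
                }
            }
            // Renormalize so the weights again form a distribution.
            for (int i = 0; i < weights.length; i++)
            {
                for (int l = 0; l < weights[i].length; l++)
                {
                    weights[i][l] /= normalizer;
                }
            }
        }
    }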

Changes:

Major changes:

  • The “early stopping” feature can now be based on any metric output with the --outputinfo command line argument.

  • Early stopping now works with --slowresume command line argument.

Minor fixes:

  • More informative output when testing.

  • Fixed various compilation glitches with recent clang (OS X/Linux).


boostree 0.1

by xavierc - December 1, 2007, 03:16:14 CET. 4089 views, 1435 downloads, 0 comments, 0 subscriptions

About: This package provides an implementation of Schapire and Singer's AdaBoost.MH for multi-label classification. As a main feature, the package provides decision-tree weak learning, a generalization of [...]

Changes:

Initial Announcement on mloss.org.