Projects tagged with tree.


JMLR MLPACK 1.0.11

by rcurtin - December 11, 2014, 18:20:35 CET. 35674 views, 6999 downloads, 6 subscriptions

Rating: 4.5 / 5 stars (based on 1 vote)

About: A scalable, fast C++ machine learning library, with emphasis on usability.

Changes:
  • Proper handling of dimension calculation in PCA.
  • Load parameter vectors properly for LinearRegression models.
  • Linker fixes for AugLagrangian specializations under Visual Studio.
  • Add support for observation weights to LinearRegression.
  • MahalanobisDistance<> now takes root of the distance by default and therefore satisfies the triangle inequality (TakeRoot now defaults to true).
  • Better handling of optional Armadillo HDF5 dependency.
  • Fixes for numerous intermittent test failures.
  • math::RandomSeed() now sets the seed for recent (>= 3.930) Armadillo versions.
  • Handle Newton method convergence better for SparseCoding::OptimizeDictionary() and make maximum iterations a parameter.
  • Known bug: CosineTree construction may fail in some cases on i386 systems (376).
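The TakeRoot change above matters because only the rooted form of the Mahalanobis distance is a true metric: the squared form violates the triangle inequality. A minimal NumPy sketch of the underlying math (not mlpack's C++ API; the function name and signature here are illustrative assumptions):

```python
import numpy as np

def mahalanobis(x, y, cov_inv, take_root=True):
    """Mahalanobis distance between x and y given an inverse covariance.
    With take_root=True (mirroring mlpack's new TakeRoot default) the
    result is a proper metric; without the root it is only a squared
    distance."""
    d = x - y
    sq = float(d @ cov_inv @ d)
    return np.sqrt(sq) if take_root else sq

# With the identity covariance this reduces to (squared) Euclidean distance.
I = np.eye(2)
a = np.array([0.0, 0.0])
b = np.array([1.0, 0.0])
c = np.array([2.0, 0.0])

# Rooted distances satisfy the triangle inequality: d(a,c) <= d(a,b) + d(b,c).
assert mahalanobis(a, c, I) <= mahalanobis(a, b, I) + mahalanobis(b, c, I)

# Squared distances do not: 4 > 1 + 1.
assert mahalanobis(a, c, I, take_root=False) > \
       mahalanobis(a, b, I, take_root=False) + mahalanobis(b, c, I, take_root=False)
```

Taking the root by default means the class can be dropped into algorithms (nearest-neighbor search, metric trees) that rely on the triangle inequality for pruning.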

XGBoost v0.3.0

by crowwork - September 2, 2014, 02:43:31 CET. 3291 views, 629 downloads, 2 subscriptions

About: xgboost: eXtreme Gradient Boosting. An efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically run parallel computation with OpenMP and can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification, and ranking. The package is designed to be extensible, so users can also easily define their own objectives.
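Custom objectives in a gradient booster boil down to supplying the first and second derivatives (gradient and Hessian) of the loss with respect to the current predictions. A hedged NumPy sketch of a squared-error objective in that grad/hess shape (xgboost's real callback receives a DMatrix rather than a plain label array; the plain array here is an assumption to keep the example self-contained):

```python
import numpy as np

def squared_error_obj(preds, labels):
    """Custom objective in the grad/hess form a gradient booster consumes.
    For L(y, p) = 0.5 * (p - y)^2:
      gradient  dL/dp    = p - y
      hessian   d2L/dp^2 = 1
    """
    grad = preds - labels
    hess = np.ones_like(preds)
    return grad, hess

preds = np.array([0.5, 2.0, 1.0])
labels = np.array([1.0, 1.0, 1.0])
grad, hess = squared_error_obj(preds, labels)
# grad is the residual preds - labels; hess is constant 1 for squared error.
```

Any differentiable loss can be plugged in the same way: the booster only ever sees the two derivative arrays, never the loss formula itself.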

Changes:

New features:
  • R support, now on CRAN
  • Faster tree construction module
  • Support for boosting from initial predictions
  • Linear booster is now parallelized, using parallel coordinate descent.