Projects tagged with tree.


MLPACK 3.0.0

by rcurtin - March 31, 2018, 05:31:08 CET. 103961 views, 18655 downloads, 6 subscriptions

Rating: 4.5 / 5 (based on 1 vote)

About: A fast, flexible C++ machine learning library, with bindings to other languages.

Changes:

Released March 30th, 2018.

  • Speed and memory improvements for DBSCAN. --single_mode can now be used in situations where RAM usage was previously too high.
  • Bump minimum required version of Armadillo to 6.500.0.
  • Add automatically generated Python bindings with the same interface as the command-line programs (see the sketch after this list).
  • Add deep learning infrastructure in src/mlpack/methods/ann/.
  • Add reinforcement learning infrastructure in src/mlpack/methods/reinforcement_learning/.
  • Add optimizers: AdaGrad, CMAES, CNE, FrankWolfe, GradientDescent, GridSearch, IQN, Katyusha, LineSearch, ParallelSGD, SARAH, SCD, SGDR, SMORMS3, SPALeRA, SVRG.
  • Add hyperparameter tuning infrastructure and cross-validation infrastructure in src/mlpack/core/cv/ and src/mlpack/core/hpt/.
  • Fix bug in mean shift.
  • Add random forests (see src/mlpack/methods/random_forest).
  • Numerous other bugfixes and testing improvements.
  • Add randomized Krylov SVD and Block Krylov SVD.
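
The new Python bindings are generated from the same definitions as the command-line programs, so a call from Python should look much like its CLI counterpart. Below is a minimal sketch combining them with the new random forest implementation; the parameter and output names (training, labels, num_trees, input_model, test, output_model, predictions) are assumptions drawn from the mlpack_random_forest command-line options and may differ from the released interface.

    # Minimal sketch of the automatically generated Python bindings; names are
    # assumed to mirror the mlpack_random_forest CLI options and may differ.
    import numpy as np
    from mlpack import random_forest

    X_train = np.random.rand(100, 4)        # 100 points, 4 features
    y_train = np.random.randint(0, 2, 100)  # binary labels
    X_test = np.random.rand(10, 4)

    # Train a random forest; results come back in a dict, analogous to the
    # CLI program's output files.
    train_out = random_forest(training=X_train, labels=y_train, num_trees=10)

    # Reuse the trained model to classify new points.
    test_out = random_forest(input_model=train_out['output_model'], test=X_test)
    print(test_out['predictions'])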

XGBoost v0.4.0

by crowwork - May 12, 2015, 08:57:16 CET. 21367 views, 3776 downloads, 3 subscriptions

About: xgboost (eXtreme Gradient Boosting) is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and tree learning algorithms. It can automatically run parallel computation with OpenMP and can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn's GBM. It supports various objective functions, including regression, classification, and ranking, and it is designed to be extensible, so users can easily define their own objectives. The newest version of xgboost supports distributed learning on platforms such as Hadoop and MPI, and scales to even larger problems.
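
Since the description stresses that objectives are user-definable, here is a brief hedged sketch of a custom logistic objective passed to xgb.train through its gradient/hessian callback; the data is synthetic and the parameter values are illustrative only.

    # Sketch of a user-defined objective for xgb.train: the callback receives
    # raw predictions and the training DMatrix, and returns the per-example
    # gradient and hessian of the loss. Data and parameters are illustrative.
    import numpy as np
    import xgboost as xgb

    def logregobj(preds, dtrain):
        labels = dtrain.get_label()
        preds = 1.0 / (1.0 + np.exp(-preds))  # raw margin -> probability
        grad = preds - labels                 # first derivative of log loss
        hess = preds * (1.0 - preds)          # second derivative of log loss
        return grad, hess

    dtrain = xgb.DMatrix(np.random.rand(200, 5),
                         label=np.random.randint(0, 2, 200))
    booster = xgb.train({'max_depth': 2, 'eta': 0.1}, dtrain,
                        num_boost_round=10, obj=logregobj)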

Changes:
  • Distributed version of xgboost that runs on YARN and scales to billions of examples
  • Direct save/load of data and models from/to S3 and HDFS
  • Feature importance visualization in the R module, by Michael Benesty
  • Predict leaf index (see the sketch after this list)
  • Poisson regression for count data
  • Early stopping option in training
  • Native save/load support in R and Python
  • xgboost models can now be saved using save/load in R
  • The xgboost Python model is now picklable
  • sklearn wrapper is supported in the Python module
  • Experimental external memory version
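
A hedged sketch combining several of the listed features from the Python module: early stopping against a validation set, leaf-index prediction, pickling the Python model, and native save/load. The data is synthetic and the parameter values are illustrative, not recommendations.

    # Sketch of early stopping, leaf-index prediction, pickling, and native
    # save/load; data and parameters are illustrative only.
    import pickle
    import numpy as np
    import xgboost as xgb

    X, y = np.random.rand(300, 5), np.random.randint(0, 2, 300)
    dtrain = xgb.DMatrix(X[:200], label=y[:200])
    dvalid = xgb.DMatrix(X[200:], label=y[200:])

    # Early stopping: training halts once the validation metric stops
    # improving for the given number of rounds.
    booster = xgb.train({'objective': 'binary:logistic', 'max_depth': 3},
                        dtrain, num_boost_round=100,
                        evals=[(dvalid, 'valid')], early_stopping_rounds=5)

    # Predict leaf index: for each example, the index of the leaf it falls
    # into in every tree, rather than a score.
    leaf_indices = booster.predict(dvalid, pred_leaf=True)

    # The Python model is picklable, and native save/load is also available.
    blob = pickle.dumps(booster)
    booster.save_model('xgb.model')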