Project details for XGBoost

XGBoost v0.3.0

by crowwork - September 2, 2014, 02:43:31 CET

Description:

xgboost: eXtreme Gradient Boosting

XGBoost is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically parallelize computation with OpenMP and can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification, and ranking. The package is designed to be extensible, so users can easily define their own objectives.
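
The extensible-objective idea can be sketched in pure Python (an illustration only, not the package's actual implementation): the user supplies the gradient and hessian of the loss, and each boosting round fits a depth-1 regression stump to them with a Newton step per leaf. Function names here (`squared_error`, `fit_stump`, `boost`) are hypothetical, but the grad/hess contract mirrors how custom objectives plug into gradient boosting.

```python
# Minimal gradient-boosting sketch (illustration only, not XGBoost's code).
# A user-defined objective returns per-example gradients and hessians;
# each round fits a depth-1 stump to them, with leaf values given by the
# Newton step -sum(grad)/sum(hess).

def squared_error(preds, labels):
    """Objective: (gradient, hessian) of 0.5 * (pred - label)^2."""
    grad = [p - y for p, y in zip(preds, labels)]
    hess = [1.0] * len(labels)
    return grad, hess

def fit_stump(x, grad, hess):
    """Pick the split on 1-D feature x that maximizes gain sum(g)^2/sum(h)."""
    best = None
    for threshold in sorted(set(x)):
        left = [i for i in range(len(x)) if x[i] <= threshold]
        right = [i for i in range(len(x)) if x[i] > threshold]
        if not left or not right:
            continue
        def leaf(ids):
            g = sum(grad[i] for i in ids)
            h = sum(hess[i] for i in ids)
            return -g / h, g * g / h  # (leaf value, gain contribution)
        lv, lgain = leaf(left)
        rv, rgain = leaf(right)
        if best is None or lgain + rgain > best[0]:
            best = (lgain + rgain, threshold, lv, rv)
    _, threshold, lv, rv = best
    return lambda xi: lv if xi <= threshold else rv

def boost(x, y, rounds=20, eta=0.5, objective=squared_error):
    """Additive model: each round fits a stump to the current grad/hess."""
    preds = [0.0] * len(y)
    model = []
    for _ in range(rounds):
        grad, hess = objective(preds, y)
        stump = fit_stump(x, grad, hess)
        model.append(stump)
        preds = [p + eta * stump(xi) for p, xi in zip(preds, x)]
    return lambda xi: sum(eta * s(xi) for s in model)

# Usage: learn a step function from a toy 1-D dataset.
f = boost([1, 2, 3, 4, 5, 6], [0, 0, 0, 1, 1, 1])
```

Swapping in a different `objective` (any function returning gradients and hessians) changes the loss without touching the booster, which is the extensibility point the description refers to.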

Changes to previous version:

New features:

  • R support, now on CRAN

  • Faster tree construction module

  • Support for boosting from initial predictions

  • Linear booster is now parallelized, using parallel coordinate descent

Supported Operating Systems: Linux, Windows, Mac OS X
Data Formats: R, NumPy, LibSVM
Tags: Parallel, Gradient Boosting, Tree, Ensemble Learning

Other available revisions

v0.3.0 (September 2, 2014, 02:43:31)

New features:

  • R support, now on CRAN

  • Faster tree construction module

  • Support for boosting from initial predictions

  • Linear booster is now parallelized, using parallel coordinate descent

v0.2 (May 17, 2014, 07:27:59)

New features:

  • Python interface

  • New objectives: weighted training, pairwise rank, multiclass softmax

  • Example script for the Kaggle Higgs competition, 20 times faster than sklearn's GBRT

v0.1 (March 27, 2014, 07:09:52)

Initial Announcement on mloss.org.
