Project details for XGBoost

XGBoost v0.2

by crowwork - May 17, 2014, 07:27:59 CET


Description:

xgboost: eXtreme Gradient Boosting

This is yet another gradient boosted regression tree (GBRT) library.

The implementation is in C++, optimized for memory usage and efficiency, and aims to handle large-scale data on a single machine.

Parallelization is done with OpenMP. The algorithm relies on a sparse feature format, which means it naturally handles missing values.
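
The missing-value handling can be illustrated with the Python interface mentioned in the changelog below. This is a minimal sketch, assuming the usual xgboost entry points (xgboost.DMatrix with its missing argument, xgboost.train) and illustrative parameter values; it is not code shipped with this release.

    import numpy as np
    import scipy.sparse as sp
    import xgboost as xgb

    # Dense input: values equal to `missing` are treated as absent, and each
    # tree node learns a default direction for such entries.
    X_dense = np.array([[1.0, -999.0, 3.0],
                        [0.5,  2.0,  -999.0],
                        [2.0,  1.0,   0.0]])
    y = np.array([1, 0, 1])
    dtrain = xgb.DMatrix(X_dense, label=y, missing=-999.0)

    # Sparse CSR input: entries that are simply not stored are handled as
    # missing in the same way, which is what the sparse feature format enables.
    X_sparse = sp.csr_matrix([[1.0, 0.0, 3.0],
                              [0.0, 2.0, 0.0],
                              [2.0, 1.0, 0.0]])
    dtrain_sparse = xgb.DMatrix(X_sparse, label=y)

    params = {"objective": "binary:logistic", "max_depth": 2, "eta": 0.3}
    booster = xgb.train(params, dtrain, 10)  # 10 boosting rounds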

Supported key components so far (a parameter sketch follows this list):

  • Gradient boosting models:

    • regression tree (GBRT)

    • linear model/lasso

  • Objectives to support tasks:

    • regression

    • classification

    • rank

  • OpenMP implementation
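
The boosters and objectives above are selected through training parameters. The following is a minimal sketch, assuming the Python interface and xgboost's usual parameter names ("gbtree", "gblinear", "reg:linear", "alpha"); it is illustrative rather than code from this release.

    import numpy as np
    import xgboost as xgb

    X = np.random.rand(100, 5)
    y = np.random.rand(100)
    dtrain = xgb.DMatrix(X, label=y)

    # Gradient boosted regression trees (GBRT) with a regression objective.
    tree_params = {"booster": "gbtree", "objective": "reg:linear",
                   "max_depth": 3, "eta": 0.1}
    tree_model = xgb.train(tree_params, dtrain, 50)

    # Boosted linear model with an L1 penalty (lasso-style regularization).
    linear_params = {"booster": "gblinear", "objective": "reg:linear",
                     "alpha": 0.1}
    linear_model = xgb.train(linear_params, dtrain, 50)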

Changes to previous version:

New features:

  • Python interface (a usage sketch follows this list)

  • New objectives: weighted training, pairwise rank, multiclass softmax

  • Example script for the Kaggle Higgs competition, running 20 times faster than scikit-learn's GBRT
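
Two of the new features can be combined in one call from the Python interface. This is a minimal sketch, assuming the DMatrix weight argument for weighted training and the "multi:softmax" objective with its "num_class" parameter; data and parameter values are illustrative, not taken from the release's example scripts.

    import numpy as np
    import xgboost as xgb

    X = np.random.rand(200, 10)
    y = np.random.randint(0, 3, size=200)        # three classes
    w = np.where(y == 2, 2.0, 1.0)               # up-weight one class

    dtrain = xgb.DMatrix(X, label=y, weight=w)   # weighted training

    params = {"objective": "multi:softmax", "num_class": 3,
              "max_depth": 4, "eta": 0.3}
    model = xgb.train(params, dtrain, 30)
    pred = model.predict(dtrain)                 # class labels, not probabilities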

Supported Operating Systems: Linux, Mac OS X
Data Formats: NumPy, LibSVM
Tags: Parallel, Gradient Boosting, Tree, Ensemble Learning
