Project details for Calibrated AdaMEC

Calibrated AdaMEC 1.0

by nnikolaou - April 8, 2017, 13:57:45 CET


Description:

AdaBoost is a successful and popular classification method, but it is not geared towards solving cost-sensitive classification problems, i.e. problems where the costs of different types of erroneous predictions are unequal. In our 2016 paper cited below, we reviewed all cost-sensitive variants of AdaBoost in the literature, along with our own adaptations. Below we provide code for the method that achieves the best empirical results without any need for parameter tuning, while satisfying all desirable theoretical properties. The method, 'Calibrated AdaMEC', is described in detail and motivated in the paper:

'Cost-sensitive boosting algorithms: Do we really need them?' Nikolaos Nikolaou, Narayanan U. Edakunni, Meelis Kull, Peter A. Flach and Gavin Brown. Machine Learning, 104(2), pages 359-384, 2016.

If you make use of the code found here, please cite the above paper.


Example of use:

The following example shows how to train Calibrated AdaMEC and how to generate scores and predictions with it. The syntax follows the conventions of scikit-learn's AdaBoost implementation.

from sklearn.ensemble import AdaBoostClassifier

from CalibratedAdaMEC import CalibratedAdaMECClassifier # Our Calibrated AdaMEC implementation can be found here

The code below assumes the user has already split the binary classification data (classes denoted 0 and 1) into training and test sets, defined the cost of a false positive C_FP and the cost of a false negative C_FN, and selected the weak learner base_estimator and the ensemble size n_estimators.
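For concreteness, here is a minimal setup sketch; the synthetic dataset, the 1:5 cost ratio, the decision-stump weak learner and the ensemble size are illustrative assumptions only, not requirements of the package:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0) # imbalanced binary data, classes 0/1
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)
C_FP, C_FN = 1.0, 5.0 # illustrative costs: a false negative is five times as costly as a false positive
base_estimator = DecisionTreeClassifier(max_depth=1) # decision stumps as weak learners
n_estimators = 100 # illustrative ensemble size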

Create and train an AdaBoostClassifier:

AdaBoost = AdaBoostClassifier(base_estimator, n_estimators)

AdaBoost = AdaBoost.fit(X_train, y_train)

Create and train a CalibratedAdaMECClassifier (being cost-sensitive, it takes C_FP and C_FN as additional arguments):

CalAdaMEC = CalibratedAdaMECClassifier(base_estimator, n_estimators, C_FP, C_FN)

CalAdaMEC = CalAdaMEC.fit(X_train, y_train)

Produce AdaBoost & Calibrated AdaMEC classifications:

labels_AdaBoost = AdaBoost.predict(X_test)

labels_CalibratedAdaMEC = CalAdaMEC.predict(X_test)

Produce AdaBoost & Calibrated AdaMEC scores (probability estimates) - keep only positive class scores:

scores_AdaBoost = AdaBoost.predict_proba(X_test)[:,1]

scores_CalibratedAdaMEC = CalAdaMEC.predict_proba(X_test)[:,1]


Examples of comparison to AdaBoost:

1) Probability Estimation

You can evaluate the two algorithms in terms of probability estimation using the Brier score or the log-loss, both found e.g. in the metrics module of scikit-learn. You should see that Calibrated AdaMEC achieves lower values on both losses (i.e. better probability estimates).

from sklearn import metrics

brier_score_AdaBoost = metrics.brier_score_loss(y_test, scores_AdaBoost)

brier_score_CalibratedAdaMEC = metrics.brier_score_loss(y_test, scores_CalibratedAdaMEC)

log_loss_AdaBoost = metrics.log_loss(y_test, scores_AdaBoost)

log_loss_CalibratedAdaMEC = metrics.log_loss(y_test, scores_CalibratedAdaMEC)
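For a quick side-by-side comparison, you can then print the four values; lower is better for both measures:

print('Brier score: AdaBoost %.4f, Calibrated AdaMEC %.4f' % (brier_score_AdaBoost, brier_score_CalibratedAdaMEC))
print('Log-loss: AdaBoost %.4f, Calibrated AdaMEC %.4f' % (log_loss_AdaBoost, log_loss_CalibratedAdaMEC))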

2) Cost-sensitive Classification

You can evaluate the cost-sensitive behaviour of the classifications produced by the two algorithms in terms of the total cost-sensitive loss (empirical risk), as shown below. In expectation, the misclassification cost should be lower for Calibrated AdaMEC on asymmetric problems (the greater the skew, the greater the performance gain of Calibrated AdaMEC over AdaBoost).

import numpy as np

from sklearn import metrics

Pos = sum(y_train[np.where(y_train == 1)]) #Number of positive training examples

Neg = len(y_train) - Pos #Number of negative training examples

skew = C_FP*Neg / (C_FN*Pos + C_FP*Neg) # Skew (combined asymmetry due to both cost and class imbalance)

conf_mat_AdaBoost = metrics.confusion_matrix(y_test, labels_AdaBoost) # Confusion matrix

cost_AdaBoost = conf_mat_AdaBoost[0,1]*skew + conf_mat_AdaBoost[1,0]*(1-skew) # Skew-sensitive cost

conf_mat_CalibratedAdaMEC = metrics.confusion_matrix(y_test, labels_CalibratedAdaMEC) # Confusion matrix

cost_CalibratedAdaMEC = conf_mat_CalibratedAdaMEC[0,1]*skew + conf_mat_CalibratedAdaMEC[1,0]*(1-skew) # Skew-sensitive cost
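Again, a simple print-out suffices for the comparison; the lower the skew-sensitive cost the better, and the gap in favour of Calibrated AdaMEC should widen as the skew moves away from 0.5:

print('Skew: %.3f' % skew)
print('Skew-sensitive cost: AdaBoost %.2f, Calibrated AdaMEC %.2f' % (cost_AdaBoost, cost_CalibratedAdaMEC))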


Looking for a more flexible implementation?

The code given here uses Platt scaling (logistic sigmoid calibration) and a 50%-50% train-calibration split, and it is restricted to the discrete version of AdaBoost.
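For intuition only, here is a rough sketch of that fixed pipeline rebuilt from standard scikit-learn primitives; it illustrates the underlying idea (Platt-scale AdaBoost's scores on held-out data, then threshold the calibrated probabilities at the minimum-expected-cost point) and is not the package's actual implementation:

from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# 50%-50% train-calibration split, mirroring the packaged code
X_fit, X_cal, y_fit, y_cal = train_test_split(X_train, y_train, test_size=0.5, random_state=0, stratify=y_train)

ada = AdaBoostClassifier(n_estimators=100, algorithm='SAMME').fit(X_fit, y_fit) # discrete AdaBoost

# Platt scaling: fit a logistic sigmoid to the ensemble's scores on the held-out half
s_cal = ada.predict_proba(X_cal)[:, 1].reshape(-1, 1)
platt = LogisticRegression().fit(s_cal, y_cal)

# AdaMEC decision rule: predict positive when the calibrated probability exceeds
# the minimum-expected-cost threshold C_FP / (C_FP + C_FN)
s_test = ada.predict_proba(X_test)[:, 1].reshape(-1, 1)
p_test = platt.predict_proba(s_test)[:, 1]
labels_sketch = (p_test > C_FP / (C_FP + C_FN)).astype(int)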

Go to https://github.com/nnikolaou/Cost-sensitive-Boosting-Tutorial for an extended IPython tutorial that summarizes the paper and provides interactive code for reproducing our experiments and running your own; every aspect (problem setup, calibration options, ensemble parameters, base learner parameters, evaluation measures) can be modified.

Changes to previous version:

Updated license information

Supported Operating Systems: Platform Independent, Platform Agnostic
Data Formats: Binary
Tags: Ensembles, Adaboost, Boosting, Ensemble Of Classifiers, Ensemble Methods, Ensemble Learning, Ensemble Model, Calibration, Class Imbalance, Cost Sensitive, Minimum Expected Cost, Risk Minimization
