Revisiting Example Dependent Cost-Sensitive Learning with Decision Trees

Oisin Mac Aodha, Gabriel J. Brostow; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2013, pp. 193-200

Abstract


Typical approaches to classification treat class labels as disjoint. For each training example, it is assumed that there is only one class label that correctly describes it, and that all other labels are equally bad. We know, however, that this binary notion of good and bad labels is too simplistic in many scenarios, hurting accuracy. In the realm of example-dependent cost-sensitive learning, each label is instead a vector representing a data point's affinity for each of the classes. At test time, our goal is not to minimize the misclassification rate, but to maximize that affinity. We propose a novel example-dependent cost-sensitive impurity measure for decision trees. Our experiments show that this new impurity measure improves test performance while still retaining the fast test times of standard classification trees. We compare our approach to classification trees and other cost-sensitive methods on three computer vision problems: tracking, descriptor matching, and optical flow, and show improvements in all three domains.
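
The paper's impurity measure is defined in the full text; as a rough illustration of the setting only, the Python sketch below scores candidate axis-aligned splits with one plausible example-dependent cost-sensitive impurity: the cheapest total cost achievable by assigning a single class label to every example in a node. Here costs can be read as negated affinities, and the names node_cost and best_split are hypothetical, not taken from the paper.

import numpy as np

def node_cost(costs):
    # costs: (n, k) array; costs[i, j] is the cost (negated affinity) of
    # assigning class j to example i. Impurity of the node = lowest total
    # cost achievable by giving every example the same class label.
    return costs.sum(axis=0).min()

def best_split(feature, costs):
    # Scan all thresholds on one feature; return the threshold that
    # minimises the summed impurities of the two children.
    order = np.argsort(feature)
    feature, costs = feature[order], costs[order]
    best_thresh, best_score = None, np.inf
    for i in range(1, len(feature)):
        if feature[i] == feature[i - 1]:
            continue  # no threshold fits between identical feature values
        score = node_cost(costs[:i]) + node_cost(costs[i:])
        if score < best_score:
            best_thresh = 0.5 * (feature[i] + feature[i - 1])
            best_score = score
    return best_thresh, best_score

# Usage: four examples, two classes, per-example cost vectors.
X = np.array([0.2, 0.5, 0.9, 1.4])
C = np.array([[0.0, 1.0],
              [0.1, 0.9],
              [0.8, 0.2],
              [1.0, 0.0]])
print(best_split(X, C))  # threshold 0.7, total child cost 0.3

Unlike a standard classification tree, a split here is judged by the realised per-example cost vectors rather than by label counts, which is what lets test-time predictions maximize affinity instead of raw accuracy.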

Related Material


[pdf]
[bibtex]
@InProceedings{Aodha_2013_ICCV,
  author    = {Mac Aodha, Oisin and Brostow, Gabriel J.},
  title     = {Revisiting Example Dependent Cost-Sensitive Learning with Decision Trees},
  booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
  month     = {December},
  year      = {2013}
}