Bootstrap Methods for the Cost-Sensitive Evaluation of Classifiers

Title: Bootstrap Methods for the Cost-Sensitive Evaluation of Classifiers
Publication Type: Conference Paper
Year of Publication: 2000
Authors: Margineantu, D. D., and T. G. Dietterich
Conference Name: Proceedings of the Seventeenth International Conference on Machine Learning
Pagination: 583–590
Date Published: 07/2000
Publisher: Morgan Kaufmann Publishers Inc.
Conference Location: San Francisco, CA
ISBN Number: 1-55860-707-2
Abstract

Many machine learning applications require classifiers that minimize an asymmetric cost function rather than the misclassification rate, and several recent papers have addressed this problem. However, these papers have either applied no statistical testing or have applied statistical methods that are not appropriate for the cost-sensitive setting. Without good statistical methods, it is difficult to tell whether these new cost-sensitive methods are better than existing methods that ignore costs, and it is also difficult to tell whether one cost-sensitive method is better than another. To rectify this problem, this paper presents two statistical methods for the cost-sensitive setting. The first constructs a confidence interval for the expected cost of a single classifier. The second constructs a confidence interval for the expected difference in costs of two classifiers. In both cases, the basic idea is to separate the problem of estimating the probabilities of each...
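For illustration only, the sketch below shows one simple way to build the two kinds of confidence intervals the abstract describes: a percentile bootstrap over the per-example costs of a single classifier, and a paired bootstrap over the per-example cost differences of two classifiers evaluated on the same test set. This is not the paper's exact procedure (the paper separates estimating the confusion-matrix cell probabilities from computing the expected cost); the function names, parameters, and the example-level resampling scheme are assumptions made for this sketch.

```python
import numpy as np

def bootstrap_expected_cost_ci(y_true, y_pred, cost_matrix,
                               n_boot=1000, alpha=0.05, rng=None):
    """Percentile-bootstrap CI for a classifier's expected cost.

    NOTE: illustrative sketch, not the procedure from the paper.
    y_true, y_pred : integer class labels on a held-out test set
    cost_matrix    : cost_matrix[i, j] = cost of predicting class j
                     when the true class is i (asymmetric in general)
    """
    rng = np.random.default_rng(rng)
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    per_example_cost = cost_matrix[y_true, y_pred]   # cost incurred on each example
    n = len(per_example_cost)
    # Resample test examples with replacement and recompute the mean cost.
    boot_means = np.array([
        per_example_cost[rng.integers(0, n, n)].mean()
        for _ in range(n_boot)
    ])
    lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return per_example_cost.mean(), (lo, hi)

def bootstrap_cost_difference_ci(y_true, y_pred_a, y_pred_b, cost_matrix,
                                 n_boot=1000, alpha=0.05, rng=None):
    """Paired-bootstrap CI for the difference in expected cost between
    classifiers A and B on the same test set (illustrative sketch)."""
    rng = np.random.default_rng(rng)
    y_true = np.asarray(y_true)
    # Per-example cost difference; resampling examples keeps the pairing.
    diff = (cost_matrix[y_true, np.asarray(y_pred_a)]
            - cost_matrix[y_true, np.asarray(y_pred_b)])
    n = len(diff)
    boot_means = np.array([
        diff[rng.integers(0, n, n)].mean() for _ in range(n_boot)
    ])
    lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    # An interval that excludes 0 suggests the two classifiers differ in cost.
    return diff.mean(), (lo, hi)
```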

URL: http://dl.acm.org/citation.cfm?id=645529.657951