Predicting and optimizing classifier utility with the power law

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

9 Scopus citations

Abstract

When data collection is costly and/or time-consuming, early prediction of classifier performance is extremely important for the design of the data mining process. The power law has previously been shown to be a good predictor of decision-tree error rates as a function of training set size. In this paper, we show that the optimal training set size for a given dataset can be computed from a learning curve characterized by a power law. Such a curve can be approximated using a small subset of the potentially available data and then used to estimate the expected trade-off between the error rate and the cost of additional observations. The proposed approach to projected optimization of classifier utility is demonstrated and evaluated on several benchmark datasets.
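The idea in the abstract can be sketched as follows: fit a power-law learning curve to error rates measured on small training subsets, then pick the training set size that minimizes a combined cost of misclassification and data acquisition. This is a minimal illustration, not the paper's actual procedure; the two-parameter model err(n) = a·n^(−b), the linear cost model, the cost weights, and the sample error rates are all assumptions made for the example.

```python
import numpy as np

# Hypothetical learning-curve points: error rates measured on
# progressively larger training subsets (values are illustrative).
sizes = np.array([100, 200, 400, 800, 1600], dtype=float)
errors = np.array([0.30, 0.24, 0.19, 0.15, 0.12])

# Fit the power law err(n) = a * n**(-b) by linear least squares
# in log-log space: log(err) = log(a) - b * log(n).
slope, intercept = np.polyfit(np.log(sizes), np.log(errors), 1)
a, b = np.exp(intercept), -slope

def predicted_error(n):
    """Projected error rate for a training set of size n."""
    return a * n ** (-b)

# Assumed utility trade-off: total_cost(n) = C_err * err(n) + C_data * n.
# Setting its derivative to zero gives a closed-form optimum:
#   n* = (C_err * a * b / C_data) ** (1 / (b + 1))
C_err, C_data = 1000.0, 0.01  # illustrative cost weights
n_opt = (C_err * a * b / C_data) ** (1.0 / (b + 1.0))
print(f"b = {b:.3f}, optimal training set size ~ {n_opt:.0f}")
```

Fitting in log-log space keeps the example dependency-free; a nonlinear fit (e.g. `scipy.optimize.curve_fit` with an added asymptote term a·n^(−b) + c) would match a three-parameter learning curve more closely.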

Original language: English
Title of host publication: ICDM Workshops 2007 - Proceedings of the 17th IEEE International Conference on Data Mining Workshops
Pages: 219-224
Number of pages: 6
DOIs
State: Published - 1 Dec 2007
Event: 17th IEEE International Conference on Data Mining Workshops, ICDM Workshops 2007 - Omaha, NE, United States
Duration: 28 Oct 2007 - 31 Oct 2007

Publication series

Name: Proceedings - IEEE International Conference on Data Mining, ICDM
ISSN (Print): 1550-4786

Conference

Conference: 17th IEEE International Conference on Data Mining Workshops, ICDM Workshops 2007
Country/Territory: United States
City: Omaha, NE
Period: 28/10/07 - 31/10/07
