A Bayes consistent 1-NN classifier

Research output: Contribution to journal › Conference article › peer-review


Abstract

We show that a simple modification of the 1-nearest neighbor classifier yields a strongly Bayes consistent learner. Prior to this work, the only strongly Bayes consistent proximity-based method was the k-nearest neighbor classifier, for k growing appropriately with sample size. We argue that a margin-regularized 1-NN enjoys considerable statistical and algorithmic advantages over the k-NN classifier. These include user-friendly finite-sample error bounds, as well as time- and memory-efficient learning and test-point evaluation algorithms with a principled speed-accuracy tradeoff. Encouraging empirical results are reported.
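To make the idea of a margin-regularized 1-NN concrete, the sketch below keeps only a gamma-separated subset of the training sample (a greedy gamma-net) and then predicts with plain 1-NN on that compressed set. This is only an illustration under assumed choices: the fixed margin value `gamma`, the greedy net construction, and the helper names `compress` and `predict_1nn` are not taken from the paper, whose algorithm selects the margin in a data-dependent way and carries the finite-sample guarantees described in the abstract.

```python
# Minimal sketch of a margin-based 1-NN: compress the sample to a
# gamma-separated subset, then classify queries by 1-NN on that subset.
# Illustrative only; not the authors' exact procedure.
import numpy as np

def compress(X, y, gamma):
    """Greedily keep points lying at least `gamma` away from every
    previously kept point (a gamma-net of the sample)."""
    kept = []
    for i in range(len(X)):
        if all(np.linalg.norm(X[i] - X[j]) >= gamma for j in kept):
            kept.append(i)
    return X[kept], y[kept]

def predict_1nn(Xc, yc, x):
    """Plain 1-nearest-neighbor prediction on the compressed set."""
    dists = np.linalg.norm(Xc - x, axis=1)
    return yc[np.argmin(dists)]

# Toy usage: two Gaussian blobs, compressed with margin 0.5.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(3.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
Xc, yc = compress(X, y, gamma=0.5)
print("compressed size:", len(Xc))
print("prediction at (2.5, 2.5):", predict_1nn(Xc, yc, np.array([2.5, 2.5])))
```

The compression step is what distinguishes this from vanilla 1-NN: the classifier stores and evaluates only the retained points, which is where the memory and test-time savings mentioned in the abstract come from, while the margin parameter controls the speed-accuracy tradeoff.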

Original language: English
Pages (from-to): 480-488
Number of pages: 9
Journal: Journal of Machine Learning Research
Volume: 38
State: Published - 1 Jan 2015
Event: 18th International Conference on Artificial Intelligence and Statistics, AISTATS 2015 - San Diego, United States
Duration: 9 May 2015 - 12 May 2015

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
