Abstract
Recent advances in large-margin classification of data residing in general metric spaces (rather than Hilbert spaces) enable classification under various natural metrics, such as edit distance and earthmover distance. The general framework developed for this purpose by von Luxburg and Bousquet [JMLR, 2004] left open the questions of computational efficiency and of providing direct bounds on classification error. We design a new algorithm for classification in general metric spaces, whose runtime and accuracy depend on the doubling dimension of the data points; it thus achieves superior classification performance in many common scenarios. The algorithmic core of our approach is an approximate (rather than exact) solution to two classical problems: Lipschitz extension and Nearest Neighbor Search. The algorithm's generalization performance is established via the fat-shattering dimension of Lipschitz classifiers.
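The central object in the abstract, a Lipschitz classifier obtained by extending training labels to the whole metric space, can be illustrated by the classical McShane–Whitney midpoint extension. The sketch below is the textbook construction, not the paper's approximate algorithm, and all function and variable names are illustrative:

```python
# Sketch: classify by the sign of the McShane-Whitney midpoint
# Lipschitz extension of +/-1 training labels over a metric space.
# (Illustrative only; the paper computes an *approximate* extension.)

def lipschitz_classifier(points, labels, dist, L):
    """Build f(x) = sign of the midpoint Lipschitz extension.

    points : training points in the metric space
    labels : +1/-1 labels, one per point
    dist   : the metric d(x, y)
    L      : Lipschitz constant of the extension
    """
    def f(x):
        # Smallest and largest values any L-Lipschitz function
        # consistent with the labels can take at x.
        upper = min(y + L * dist(x, p) for p, y in zip(points, labels))
        lower = max(y - L * dist(x, p) for p, y in zip(points, labels))
        return 1 if (upper + lower) / 2.0 >= 0 else -1
    return f

# Toy example on the real line with d(x, y) = |x - y|
pts = [0.0, 1.0, 3.0, 4.0]
ys = [-1, -1, 1, 1]
f = lipschitz_classifier(pts, ys, lambda a, b: abs(a - b), L=1.0)
print(f(0.5), f(3.5))  # -1 1
```

Evaluating this exact extension costs a scan over all training points per query; the paper's contribution is, in part, replacing this with an approximate evaluation whose cost is governed by the doubling dimension of the data.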
| Original language | English |
| --- | --- |
| Title of host publication | COLT 2010 - The 23rd Conference on Learning Theory |
| Editors | A. Kalai, M. Mohri |
| Publisher | Omnipress |
| Pages | 433-440 |
| Number of pages | 8 |
| ISBN (Print) | 9780982252925 |
| State | Published - 2010 |
| Event | 23rd Conference on Learning Theory, COLT 2010 - Haifa, Israel |
| Duration | 27 Jun 2010 → 29 Jun 2010 |
Conference

| Conference | 23rd Conference on Learning Theory, COLT 2010 |
| --- | --- |
| Country/Territory | Israel |
| City | Haifa |
| Period | 27/06/10 → 29/06/10 |
ASJC Scopus subject areas
- Education