We develop a general framework for margin-based multicategory classification in metric spaces. The basic workhorse is a margin-regularized version of the nearest-neighbor classifier. We prove generalization bounds that match the state of the art in sample size n and significantly improve the dependence on the number of classes κ. Our point of departure is a nearly Bayes-optimal finite-sample risk bound independent of κ. Although κ-free, this bound is unregularized and non-adaptive, which motivates our main result: Rademacher and scale-sensitive margin bounds with a logarithmic dependence on κ. As the best previous risk estimates in this setting were of order √κ, our bound is exponentially sharper. From the algorithmic standpoint, in doubling metric spaces our classifier may be trained on n examples in O(n² log n) time and evaluated on new points in O(log n) time.