A new perspective on convex relaxations of sparse SVM

Noam Goldberg, Sven Leyffer, Todd Munson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

This paper proposes a convex relaxation of a sparse support vector machine (SVM) based on the perspective relaxation of mixed-integer nonlinear programs. We seek to minimize the zero-norm of the hyperplane normal vector with a standard SVM hinge-loss penalty and extend our approach to a zero-one loss penalty. The relaxation that we propose is a second-order cone formulation that can be efficiently solved by standard conic optimization solvers. We compare the optimization properties and classification performance of the second-order cone formulation with previous sparse SVM formulations suggested in the literature.
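As background on the technique the abstract describes, the sketch below shows a generic perspective relaxation of a sparse SVM. This is an illustrative reconstruction of the general approach, not the paper's exact formulation; the variables $v_j$, the parameters $C, \lambda$, and the objective weighting are assumptions made here for concreteness.

```latex
% Schematic sparse SVM: binary indicators z_j select features (z_j = 0
% forces w_j = 0); relaxing z_j to [0,1] and linking w_j to z_j through
% the perspective of w_j^2 yields rotated second-order cone constraints.
\begin{align*}
\min_{w,\,b,\,\xi,\,z,\,v} \quad
  & \sum_{j} z_j \;+\; C \sum_{i} \xi_i \;+\; \lambda \sum_{j} v_j \\
\text{s.t.} \quad
  & y_i \left( w^{\top} x_i + b \right) \ge 1 - \xi_i,
    \qquad \xi_i \ge 0, \\
  & w_j^2 \le v_j\, z_j,
    \qquad v_j \ge 0,
    \qquad 0 \le z_j \le 1 .
\end{align*}
```

The constraints $w_j^2 \le v_j z_j$ are rotated second-order cone constraints, which is why the resulting relaxation can be passed directly to standard conic optimization solvers, as the abstract notes.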

Original language: English
Title of host publication: Proceedings of the 2013 SIAM International Conference on Data Mining, SDM 2013
Editors: Joydeep Ghosh, Zoran Obradovic, Jennifer Dy, Zhi-Hua Zhou, Chandrika Kamath, Srinivasan Parthasarathy
Publisher: Society for Industrial and Applied Mathematics (SIAM)
Pages: 450-457
Number of pages: 8
ISBN (Electronic): 9781611972627
DOIs
State: Published - 1 Jan 2013
Externally published: Yes
Event: SIAM International Conference on Data Mining, SDM 2013 - Austin, United States
Duration: 2 May 2013 - 4 May 2013

Publication series

Name: Proceedings of the 2013 SIAM International Conference on Data Mining, SDM 2013

Conference

Conference: SIAM International Conference on Data Mining, SDM 2013
Country/Territory: United States
City: Austin
Period: 2/05/13 - 4/05/13

ASJC Scopus subject areas

  • Computer Science Applications
  • Software
  • Theoretical Computer Science
  • Information Systems
  • Signal Processing
