Sparse weighted voting classifier selection and its linear programming relaxations

Noam Goldberg, Jonathan Eckstein

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the problem of minimizing the number of misclassifications of a weighted voting classifier, plus a penalty proportional to the number of nonzero weights. We first prove that its optimum is at least as hard to approximate as the minimum disagreement halfspace problem for a wide range of penalty parameter values. After formulating the problem as a mixed integer program (MIP), we show that common "soft margin" linear programming (LP) formulations for constructing weighted voting classifiers are equivalent to an LP relaxation of our formulation. We show that this relaxation is very weak, with a potentially exponential integrality gap. However, we also show that augmenting the relaxation with certain valid inequalities tightens it considerably, yielding a linear upper bound on the gap for all values of the penalty parameter that exceed a reasonable threshold. Unlike earlier techniques proposed for similar problems (Bradley and Mangasarian (1998) [4], Weston et al. (2003) [14]), our approach provides bounds on the optimal solution value.
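The "soft margin" LP formulations mentioned in the abstract can be illustrated with a small sketch. The following is a minimal toy example, not the paper's exact formulation: it solves a standard soft-margin LP that minimizes total slack plus a penalty `D` times the sum of the (nonnegative) base-classifier weights, over a hypothetical four-point, two-classifier instance. The matrix `H`, labels `y`, and parameter value `D` are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy instance: 4 training points, 2 base classifiers.
# H[i, j] = prediction of base classifier j on point i (+1 or -1).
H = np.array([[ 1,  1],
              [ 1, -1],
              [-1,  1],
              [-1, -1]])
y = np.array([1, 1, -1, -1])   # true labels; classifier 0 happens to be perfect
n, m = H.shape
D = 0.1                        # illustrative weight-penalty parameter

# Soft-margin LP:  minimize  D * sum(w) + sum(xi)
#   subject to     y_i * (H w)_i + xi_i >= 1,   w >= 0,  xi >= 0
# Variables are stacked as [w_1..w_m, xi_1..xi_n].
c = np.concatenate([D * np.ones(m), np.ones(n)])
A_ub = np.hstack([-(y[:, None] * H), -np.eye(n)])  # margin constraints as <=
b_ub = -np.ones(n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))

w, xi = res.x[:m], res.x[m:]
print("weights:", w, "slacks:", xi, "objective:", res.fun)
```

On this instance the LP puts all weight on the perfect base classifier and incurs no slack; note that this continuous penalty on the weight sum is precisely the kind of relaxation of the nonzero-weight count whose integrality gap the paper analyzes.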

Original language: English
Pages (from-to): 481-486
Number of pages: 6
Journal: Information Processing Letters
Volume: 112
Issue number: 12
DOIs
State: Published - 30 Jun 2012
Externally published: Yes

Keywords

  • Computational complexity
  • Hardness of approximation
  • Integrality gap
  • Machine learning
  • Sparsity
  • Weighted voting classification

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Signal Processing
  • Information Systems
  • Computer Science Applications
