Variance optimized bagging

Philip Derbeko, Ran El-Yaniv, Ron Meir

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

23 Scopus citations

Abstract

We propose and study a new technique for aggregating an ensemble of bootstrapped classifiers. In this method we seek a linear combination of the base classifiers such that the weights are optimized to reduce variance. Minimum-variance combinations are computed using quadratic programming. This optimization technique is borrowed from Mathematical Finance, where it is called Markowitz Mean-Variance Portfolio Optimization. We test the new method on a number of binary classification problems from the UCI repository using a Support Vector Machine (SVM) as the base-classifier learning algorithm. Our results indicate that the proposed technique can consistently outperform Bagging and can dramatically improve SVM performance even in cases where Bagging fails to improve the base classifier.
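The aggregation step described in the abstract maps onto a small quadratic program. Below is a minimal sketch, not the authors' implementation: it assumes binary labels in {-1, +1}, treats each bootstrapped SVM's signed margin on a held-out validation set as the "return" of an asset, and solves the Markowitz-style minimum-variance problem with SciPy's SLSQP solver. The paper specifies quadratic programming but neither this particular solver nor this exact definition of "return".

```python
# Minimal sketch of variance-optimized aggregation in the Markowitz style.
# Assumptions (not from the paper): labels are in {-1, +1}, "returns" are
# the signed margins y * f_i(x) on a validation set, and weights live on
# the probability simplex.

import numpy as np
from scipy.optimize import minimize
from sklearn.svm import SVC
from sklearn.utils import resample

def fit_bootstrap_svms(X, y, n_estimators=25, random_state=0):
    """Train an ensemble of SVMs on bootstrap replicates of (X, y)."""
    rng = np.random.RandomState(random_state)
    models = []
    for _ in range(n_estimators):
        Xb, yb = resample(X, y, random_state=rng)
        models.append(SVC(kernel="rbf").fit(Xb, yb))
    return models

def markowitz_weights(models, X_val, y_val, min_mean=0.0):
    """Minimum-variance ensemble weights, portfolio-optimization style.

    Minimizes w' S w subject to the mean "return" meeting a floor,
    w >= 0, and sum(w) = 1, where S is the covariance of the
    classifiers' signed margins on the validation set.
    """
    # Rows: validation points; columns: classifiers.
    R = np.column_stack([y_val * m.decision_function(X_val) for m in models])
    mu = R.mean(axis=0)            # mean "return" of each classifier
    S = np.cov(R, rowvar=False)    # covariance between classifiers

    k = len(models)
    w0 = np.full(k, 1.0 / k)       # uniform weights = plain bagging
    cons = [
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "ineq", "fun": lambda w: w @ mu - min_mean},
    ]
    res = minimize(lambda w: w @ S @ w, w0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * k, constraints=cons)
    return res.x

def predict(models, w, X):
    """Sign of the weighted combination of decision functions."""
    F = np.column_stack([m.decision_function(X) for m in models])
    return np.sign(F @ w)
```

Note that plain Bagging corresponds to the uniform initial weights `w0`; the optimizer departs from them only when down-weighting strongly correlated classifiers lowers the ensemble's variance.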

Original language: English
Title of host publication: Machine Learning
Subtitle of host publication: ECML 2002 - 13th European Conference on Machine Learning, Proceedings
Editors: Tapio Elomaa, Heikki Mannila, Hannu Toivonen
Publisher: Springer Verlag
Pages: 60-72
Number of pages: 13
ISBN (Print): 9783540440369
State: Published - 1 Jan 2002
Externally published: Yes
Event: 13th European Conference on Machine Learning, ECML 2002 - Helsinki, Finland
Duration: 19 Aug 2002 – 23 Aug 2002

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 2430
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 13th European Conference on Machine Learning, ECML 2002
Country/Territory: Finland
City: Helsinki
Period: 19/08/02 – 23/08/02

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
