Kernelization: Theory of Parameterized Preprocessing

Fedor V. Fomin, Daniel Lokshtanov, Saket Saurabh, Meirav Zehavi

Research output: Book/Report › Book › peer-review

233 Scopus citations

Abstract

Preprocessing, or data reduction, is a standard technique for simplifying and speeding up computation. Written by a team of experts in the field, this book introduces a rapidly developing area of preprocessing analysis known as kernelization. The authors provide an overview of basic methods and important results, with accessible explanations of the most recent advances in the area, such as meta-kernelization, representative sets, polynomial lower bounds, and lossy kernelization. The text is divided into four parts, which cover the different theoretical aspects of the area: upper bounds, meta-theorems, lower bounds, and beyond kernelization. The methods are demonstrated through extensive examples using a single data set. Written to be self-contained, the book only requires a basic background in algorithmics and will be of use to professionals, researchers, and graduate students in theoretical computer science, optimization, combinatorics, and related fields.
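
To make the abstract's notion of preprocessing with provable guarantees concrete, here is a minimal sketch, assuming a Python reader, of the classic Buss kernel for Vertex Cover, the standard first example of kernelization. The function name and edge-list interface are illustrative choices made here, not code from the book: two reduction rules are applied exhaustively, after which any yes-instance has at most k² edges.

```python
# Illustrative sketch (not from the book): Buss's kernelization for
# Vertex Cover. Input format and names are assumptions for this example.

def vertex_cover_kernel(edges, k):
    """Reduce a Vertex Cover instance (edges, k) to an equivalent kernel.

    Returns (kernel_edges, reduced_k, forced), where `forced` holds the
    vertices every cover of size <= k must contain, or None when the
    instance is a provable NO-instance.
    """
    # Build adjacency sets. Isolated vertices never occur in an edge
    # list, so Rule 1 (delete isolated vertices) holds automatically.
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

    forced = set()
    changed = True
    while changed:
        changed = False
        # Rule 2: a vertex of degree > k is in every cover of size <= k,
        # since otherwise its > k incident edges would need > k distinct
        # other endpoints in the cover. Take it and decrease the budget.
        for v in list(adj):
            if len(adj[v]) > k:
                for u in adj[v]:
                    adj[u].discard(v)
                del adj[v]
                forced.add(v)
                k -= 1
                changed = True
        if k < 0:
            return None  # more than k vertices were forced: NO-instance
        # Re-apply Rule 1: drop vertices isolated by the deletions.
        for v in [v for v in adj if not adj[v]]:
            del adj[v]

    kernel_edges = {tuple(sorted((v, u))) for v in adj for u in adj[v]}
    # Every remaining vertex has degree <= k, so a YES-instance can have
    # at most k^2 edges left; a larger remainder is a NO-instance.
    if len(kernel_edges) > k * k:
        return None
    return kernel_edges, k, forced

# A star with five leaves and k = 1: the center has degree > k, so it
# is forced into the cover and the kernel becomes empty.
print(vertex_cover_kernel([(0, i) for i in range(1, 6)], 1))
# -> (set(), 0, {0})
```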

Original language: English
Publisher: Cambridge University Press
Number of pages: 516
ISBN (Electronic): 9781107415157
ISBN (Print): 9781107057760
DOI: 10.1017/9781107415157
State: Published - 1 Jan 2019

ASJC Scopus subject areas

  • General Computer Science
