Approximate Gács-Körner Common Information

Salman Salamatian, Asaf Cohen, Muriel Medard

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

We propose to exploit the structure of the correlation between two random variables X and Y via a relaxation of the Common Information problem of Gács and Körner (GK Common Information). Consider two correlated sources X and Y generated from a joint distribution PX,Y. We study embeddings of X into discrete random variables U such that H(U|Y) ≤ δ, while maximizing I(X; U). When δ = 0, this reduces to the GK Common Information problem. However, unlike the GK Common Information, which is known to be zero for many pairs of random variables (X, Y), we show that this relaxation captures the structure in the correlation between X and Y for a much broader range of joint distributions, and we showcase applications to some problems in multi-terminal information theory.
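
For concreteness, the relaxed problem maximizes I(X; U) subject to H(U|Y) ≤ δ over embeddings U of X. Below is a minimal brute-force sketch of this optimization on a toy joint distribution, assuming deterministic embeddings U = f(X) and small alphabets; the function names, alphabet sizes, and example distribution are illustrative assumptions, not taken from the paper.

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector, ignoring zeros."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def approx_gk_common_information(P_xy, num_u, delta):
    """Brute-force search over deterministic embeddings U = f(X).

    Maximizes I(X; U) subject to H(U | Y) <= delta, where P_xy is the
    joint distribution given as a |X| x |Y| array.
    """
    nx, ny = P_xy.shape
    p_y = P_xy.sum(axis=0)

    best_val, best_f = -1.0, None
    # Enumerate every mapping f: {0,...,nx-1} -> {0,...,num_u-1}.
    for f in itertools.product(range(num_u), repeat=nx):
        # Joint distribution of (U, Y) induced by U = f(X).
        P_uy = np.zeros((num_u, ny))
        for x in range(nx):
            P_uy[f[x]] += P_xy[x]
        p_u = P_uy.sum(axis=1)

        # H(U | Y) = H(U, Y) - H(Y)
        h_u_given_y = entropy(P_uy.flatten()) - entropy(p_y)
        if h_u_given_y > delta + 1e-12:
            continue  # constraint violated

        # U is a deterministic function of X, so I(X; U) = H(U).
        i_xu = entropy(p_u)
        if i_xu > best_val:
            best_val, best_f = i_xu, f
    return best_val, best_f

# Toy example: two correlated binary sources (BSC-like coupling).
P_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])
val, f = approx_gk_common_information(P_xy, num_u=2, delta=0.3)
print(f"max I(X;U) = {val:.3f} bits with f = {f}")
```

Since U = f(X) is deterministic here, I(X; U) = H(U) and the search only needs the induced joint distribution of (U, Y). Setting delta = 0 in this sketch recovers the exact GK Common Information, which is zero for the toy distribution above because its joint matrix has no block structure.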

Original language: English
Title of host publication: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2234-2239
Number of pages: 6
ISBN (Electronic): 9781728164328
DOIs:
State: Published - 1 Jun 2020
Event: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Los Angeles, United States
Duration: 21 Jul 2020 - 26 Jul 2020

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2020-June
ISSN (Print): 2157-8095

Conference

Conference: 2020 IEEE International Symposium on Information Theory, ISIT 2020
Country/Territory: United States
City: Los Angeles
Period: 21/07/20 - 26/07/20

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
