On Rate Distortion via Constrained Optimization of Estimated Mutual Information

Dor Tsur, Bashar Huleihel, Haim H. Permuter

Research output: Contribution to journal › Article › peer-review

Abstract

We propose a new methodology for estimating the rate distortion function (RDF), considering both continuous and discrete reconstruction spaces. The approach is input-space agnostic and requires prior knowledge of neither the source distribution nor the distortion function. Our method is therefore a general solution to the RDF estimation problem, whereas existing works focus on a specific domain. The approach leverages neural estimation and constrained optimization of mutual information to optimize a generative model of the input distribution. In continuous spaces we learn a sample-generating model, while a probability mass function model is proposed for discrete spaces. Formal guarantees of the proposed method are explored and implementation details are discussed. We demonstrate our method's superior performance on both high-dimensional and large-alphabet synthetic data. In contrast to existing works, our estimator readily adapts to the rate distortion perception framework, which is central to contemporary compression tasks. Consequently, our method strengthens the connection between information theory and machine learning, proposing new solutions to the problem of lossy compression.
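For context, the classical baseline that such neural estimators generalize is the Blahut-Arimoto algorithm, which computes the RDF for a discrete source with a known distribution and distortion function — exactly the prior knowledge the paper's method dispenses with. A minimal sketch (an illustration of the classical algorithm, not the authors' neural approach):

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iter=500):
    """One point on the rate-distortion curve of a discrete source.

    p_x  : source distribution over the input alphabet
    dist : distortion matrix, dist[x, y] = d(x, y)
    beta : slope parameter trading off rate against distortion
    Returns (rate_in_bits, expected_distortion).
    """
    n_y = dist.shape[1]
    q_y = np.full(n_y, 1.0 / n_y)                 # output marginal, start uniform
    for _ in range(n_iter):
        # Conditional Q(y|x) proportional to q(y) * exp(-beta * d(x, y))
        log_w = np.log(q_y)[None, :] - beta * dist
        log_w -= log_w.max(axis=1, keepdims=True)  # numerical stability
        Q = np.exp(log_w)
        Q /= Q.sum(axis=1, keepdims=True)
        q_y = p_x @ Q                              # update output marginal
    # Rate is the mutual information I(X;Y) induced by Q, in bits
    rate = np.sum(p_x[:, None] * Q * (np.log2(Q) - np.log2(q_y)[None, :]))
    distortion = np.sum(p_x[:, None] * Q * dist)
    return rate, distortion

# Sanity check on a binary symmetric source under Hamming distortion,
# where the RDF is known in closed form: R(D) = 1 - h2(D).
p_x = np.array([0.5, 0.5])
hamming = 1.0 - np.eye(2)
rate, D = blahut_arimoto(p_x, hamming, beta=2.0)
h2 = -D * np.log2(D) - (1.0 - D) * np.log2(1.0 - D)
print(rate, D)  # rate matches 1 - h2(D)
```

This baseline needs the full distortion matrix and source distribution up front and scales with the alphabet size, which is what motivates replacing it with sample-based neural estimation in high-dimensional or large-alphabet settings.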

Original language: English
Journal: IEEE Access
State: Accepted/In press - 1 Jan 2024

Keywords

  • Alternating optimization
  • MINE
  • generative modeling
  • mutual information
  • neural distribution transformer
  • neural estimation
  • rate distortion
  • rate distortion perception

ASJC Scopus subject areas

  • General Computer Science
  • General Materials Science
  • General Engineering
