Mean Estimation From One-Bit Measurements

Alon Kipnis, John C. Duchi

Research output: Contribution to journal › Article › peer-review


Abstract

We consider the problem of estimating the mean of a symmetric log-concave distribution under the constraint that only a single bit per sample from this distribution is available to the estimator. We study the mean squared error as a function of the sample size (and hence the number of bits). We consider three settings: first, a centralized setting, where an encoder may release $n$ bits given a sample of size $n$, and for which there is no asymptotic penalty for quantization; second, an adaptive setting in which each bit is a function of the current observation and previously recorded bits, where we show that the optimal relative efficiency compared to the sample mean is precisely the efficiency of the median; lastly, we show that in a distributed setting where each bit is only a function of a local sample, no estimator can achieve optimal efficiency uniformly over the parameter space. We additionally complement our results in the adaptive setting by showing that one round of adaptivity is sufficient to achieve optimal mean squared error.
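The adaptive setting described above can be illustrated with a simple sketch: each recorded bit is the sign of the current observation relative to the running estimate, and the estimate is updated by a Robbins-Monro-style recursion. This is not the paper's construction verbatim, only a hedged illustration of the idea; for a symmetric distribution such a sign-driven scheme tracks the median, which coincides with the mean. The gain value below is an illustrative choice, not one taken from the paper.

```python
import random

def one_bit_mean_estimate(samples, gain=1.25):
    """Adaptive one-bit estimation sketch.

    Each bit recorded from sample x is sign(x - theta), where theta is the
    current estimate; theta is updated by a decreasing-step recursion.
    For a symmetric distribution the recursion converges to the median,
    which equals the mean. The gain is a hypothetical tuning constant.
    """
    theta = 0.0
    for i, x in enumerate(samples, start=1):
        bit = 1.0 if x > theta else -1.0  # the single bit per sample
        theta += (gain / i) * bit         # Robbins-Monro update
    return theta

# Illustrative usage on synthetic Gaussian data with true mean 1.0.
random.seed(0)
data = [random.gauss(1.0, 1.0) for _ in range(50000)]
est = one_bit_mean_estimate(data)
```

Because the estimator only ever sees one bit per sample, its asymptotic variance is that of the sample median rather than the sample mean, matching the relative-efficiency result stated in the abstract.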

Original language: English
Pages (from-to): 6276-6296
Number of pages: 21
Journal: IEEE Transactions on Information Theory
Volume: 68
Issue number: 9
DOIs
State: Published - 1 Sep 2022
Externally published: Yes

Keywords

  • Quantization
  • adaptive estimation
  • distributed estimation
  • minimax techniques
  • source coding

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

