Compressive system identification: Sequential methods and entropy bounds

Avishy Y. Carmi

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

In the first part of this work, a novel Kalman filtering-based method is introduced for estimating the coefficients of sparse, or more broadly, compressible autoregressive models using fewer observations than normally required. By virtue of its (unscented) Kalman filter mechanism, the derived method addresses the main difficulties inherent in the underlying estimation problem. In particular, it facilitates sequential processing of observations and is shown to attain good recovery performance, particularly under substantial deviations from the ideal conditions assumed by the theory of compressive sensing. In the second part of this work we derive several information-theoretic bounds pertaining to the problem at hand. The obtained bounds establish the relation between the complexity of the autoregressive process and the attainable estimation accuracy through the use of a novel measure of complexity. This measure is used in this work as a substitute for the generally incomputable restricted isometry property.
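The abstract's core idea, estimating a sparse coefficient vector sequentially with a Kalman filter, can be illustrated with a minimal sketch. This is not the paper's actual algorithm; it is a common Kalman-based compressive-sensing heuristic in which the static coefficient vector is the filter state, each new sample supplies a regression-row measurement, and an extra "pseudo-measurement" of the l1 norm (linearized via the sign vector) nudges the estimate toward sparsity. All model sizes, noise variances, and the pseudo-measurement trick itself are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse AR(8) model: only 2 of the 8 coefficients are nonzero (assumed values).
p = 8
a_true = np.zeros(p)
a_true[[1, 5]] = [0.5, -0.3]

# Simulate the autoregressive process with small driving noise.
T = 60  # deliberately few observations
x = np.zeros(T + p)
for t in range(p, T + p):
    x[t] = a_true @ x[t - p:t][::-1] + 0.1 * rng.standard_normal()

# Kalman filter over the static coefficient vector a (the state);
# the p most recent samples form each measurement row.
a_hat = np.zeros(p)
P = np.eye(p)
r = 0.01        # measurement noise variance (assumed)
r_pseudo = 0.5  # pseudo-measurement noise variance (assumed)

for t in range(p, T + p):
    # Standard scalar-measurement update: x[t] = h @ a + noise.
    h = x[t - p:t][::-1]
    k = P @ h / (h @ P @ h + r)
    a_hat = a_hat + k * (x[t] - h @ a_hat)
    P = P - np.outer(k, h @ P)

    # Sparsity pseudo-measurement: 0 ~ ||a||_1, linearized via sign(a).
    s = np.sign(a_hat)
    k = P @ s / (s @ P @ s + r_pseudo)
    a_hat = a_hat - k * (s @ a_hat)  # innovation is 0 - ||a_hat||_1
    P = P - np.outer(k, s @ P)

print(np.round(a_hat, 2))
```

The pseudo-measurement shrinks coefficients whose support is not reinforced by the data, which is one simple way a Kalman mechanism can promote sparse solutions while still processing observations one at a time.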

Original language: English
Pages (from-to): 751-770
Number of pages: 20
Journal: Digital Signal Processing: A Review Journal
Volume: 23
Issue number: 3
DOIs
State: Published - 1 Jan 2013
Externally published: Yes

Keywords

  • Autoregressive processes
  • Complex systems
  • Compressive sensing
  • Differential entropy
  • Kalman filtering
  • Non-RIP
  • Sensing complexity
  • System identification
  • Time-series
  • Variable selection

ASJC Scopus subject areas

  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Statistics, Probability and Uncertainty
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics
  • Electrical and Electronic Engineering
