Cramér-Rao Bound Under Norm Constraint

Research output: Contribution to journal › Article › peer-review

Abstract

The constrained Cramér-Rao bound (CCRB) is a benchmark for constrained parameter estimation. However, the CCRB unbiasedness conditions are too strict, and thus the CCRB may not be a lower bound for estimators under constraints. The recently developed Lehmann-unbiased CCRB (LU-CCRB) was shown to be a lower bound on the performance of the commonly used constrained maximum likelihood (CML) estimator in cases where the CCRB is not. In constrained parameter estimation, the estimator is usually required to satisfy the constraints. However, the LU-CCRB is a lower bound for Lehmann-unbiased estimators that do not necessarily satisfy the constraints. In this letter, we consider the norm constraint and derive a novel bound, called the norm-constrained CCRB (NC-CCRB), which is a lower bound on the trace of the mean-squared-error matrix of Lehmann-unbiased estimators that satisfy the norm constraint. The NC-CCRB is shown to be tighter than the LU-CCRB. In the simulations, we consider a linear estimation problem under a norm constraint, in which the proposed NC-CCRB predicts the performance of the CML estimator better than the CCRB trace and the LU-CCRB do.
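To illustrate the kind of comparison reported in the simulations, the following minimal sketch assumes a simple linear Gaussian model y = θ + w with the norm-equality constraint ||θ|| = c; in this case the CML estimate is the projection of the unconstrained ML estimate (y itself) onto the sphere of radius c. The model, dimension, and noise level are illustrative assumptions, not the paper's actual simulation setup, and the comparison shown is against the unconstrained CRB trace only.

```python
import numpy as np

# Illustrative Monte Carlo sketch (assumed model, not the paper's setup):
# y = theta + w, w ~ N(0, sigma^2 I), with the constraint ||theta|| = c.
# Under this model, the CML estimate is c * y / ||y||, i.e. the unconstrained
# ML estimate projected onto the sphere of radius c.

rng = np.random.default_rng(0)

m, sigma, c = 4, 0.5, 1.0                      # dimension, noise std, constraint radius
theta = c * np.array([1.0, 0.0, 0.0, 0.0])     # true parameter satisfying ||theta|| = c
n_trials = 100_000

sq_err_sum = 0.0
for _ in range(n_trials):
    y = theta + sigma * rng.standard_normal(m)
    theta_cml = c * y / np.linalg.norm(y)      # CML estimate under the norm constraint
    sq_err_sum += np.sum((theta_cml - theta) ** 2)

mse_trace = sq_err_sum / n_trials              # empirical trace of the MSE matrix
crb_trace = m * sigma ** 2                     # unconstrained CRB trace for this model

print(f"empirical CML MSE trace:  {mse_trace:.4f}")
print(f"unconstrained CRB trace:  {crb_trace:.4f}")
```

In this toy setting the empirical MSE trace of the CML estimator typically falls below the unconstrained CRB trace, which is exactly the situation that motivates constrained bounds such as the LU-CCRB and the NC-CCRB proposed in the letter.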
Original language: English
Pages (from-to): 1393-1397
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 26
Issue number: 9
DOIs
State: Published - Jun 2019
