TY - JOUR
T1 - Low-complexity methods for estimation after parameter selection
AU - Harel, Nadav
AU - Routtenberg, Tirza
N1 - Funding Information:
Manuscript received March 25, 2019; revised September 29, 2019; accepted January 17, 2020. Date of publication January 29, 2020; date of current version February 14, 2020. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Abd-Krim Seghouane. This work was supported by the ISRAEL SCIENCE FOUNDATION (ISF), under Grant 1173/16. Nadav Harel has been funded by the Kreitman School of Advanced Graduate Studies. (Corresponding author: Tirza Routtenberg.) The authors are with the Department of Electrical and Computer Engineering, Ben-Gurion University of the Negev, Beer-Sheva 84105, Israel (e-mail: nadavhar@post.bgu.ac.il; tirzar@bgu.ac.il).
Publisher Copyright:
© 1991-2012 IEEE.
PY - 2020/1/1
Y1 - 2020/1/1
N2 - Statistical inference of multiple parameters often involves a preliminary parameter selection stage. The selection stage has an impact on subsequent estimation, for example by introducing a selection bias. The post-selection maximum likelihood (PSML) estimator is shown to reduce the selection bias and the post-selection mean-squared-error (PSMSE) compared with conventional estimators, such as the maximum likelihood (ML) estimator. However, the computational complexity of the PSML is usually high due to the multi-dimensional exhaustive search for a global maximum of the post-selection log-likelihood (PSLL) function. Moreover, the PSLL involves the probability of selection that, in general, does not have an analytical form. In this paper, we develop new low-complexity post-selection estimation methods for a two-stage estimation after parameter selection architecture. The methods are based on implementing the iterative maximization by parts (MBP) approach, which is based on the decomposition of the PSLL function into 'easily-optimized' and complicated parts. The proposed second-best PSML method applies the MBP-PSML algorithm with a pairwise probability of selection between the two highest-ranked parameters w.r.t. the selection rule. The proposed SA-PSML method is based on using stochastic approximation (SA) and Monte Carlo integrations to obtain a non-parametric estimation of the gradient of the probability of selection and then applying the MBP-PSML algorithm on this approximation. For low-complexity performance analysis, we develop the empirical post-selection Cramér-Rao-type lower bound. Simulations demonstrate that the proposed post-selection estimation methods are tractable and reduce both the bias and the PSMSE, compared with the ML estimator, while only requiring moderate computational complexity.
AB - Statistical inference of multiple parameters often involves a preliminary parameter selection stage. The selection stage has an impact on subsequent estimation, for example by introducing a selection bias. The post-selection maximum likelihood (PSML) estimator is shown to reduce the selection bias and the post-selection mean-squared-error (PSMSE) compared with conventional estimators, such as the maximum likelihood (ML) estimator. However, the computational complexity of the PSML is usually high due to the multi-dimensional exhaustive search for a global maximum of the post-selection log-likelihood (PSLL) function. Moreover, the PSLL involves the probability of selection that, in general, does not have an analytical form. In this paper, we develop new low-complexity post-selection estimation methods for a two-stage estimation after parameter selection architecture. The methods are based on implementing the iterative maximization by parts (MBP) approach, which is based on the decomposition of the PSLL function into 'easily-optimized' and complicated parts. The proposed second-best PSML method applies the MBP-PSML algorithm with a pairwise probability of selection between the two highest-ranked parameters w.r.t. the selection rule. The proposed SA-PSML method is based on using stochastic approximation (SA) and Monte Carlo integrations to obtain a non-parametric estimation of the gradient of the probability of selection and then applying the MBP-PSML algorithm on this approximation. For low-complexity performance analysis, we develop the empirical post-selection Cramér-Rao-type lower bound. Simulations demonstrate that the proposed post-selection estimation methods are tractable and reduce both the bias and the PSMSE, compared with the ML estimator, while only requiring moderate computational complexity.
KW - Estimation after parameter selection
KW - maximization by parts
KW - post-selection maximum-likelihood (PSML)
KW - selection bias
KW - stochastic approximation
UR - http://www.scopus.com/inward/record.url?scp=85079466301&partnerID=8YFLogxK
U2 - 10.1109/TSP.2020.2970311
DO - 10.1109/TSP.2020.2970311
M3 - Article
AN - SCOPUS:85079466301
SN - 1053-587X
VL - 68
SP - 1152
EP - 1167
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
M1 - 8974191
ER -