Minimax estimation of the mean of a general distribution when the parameter space is restricted

Avraham A Melkman, Ya'acov Ritov

Research output: Contribution to journal › Article › peer-review


We consider the one-dimensional additive model Y = θ + X. If X is a (standard) normal random variable and θ is completely unknown, then of course δ(y) = y is the minimax estimator. This same estimator is no longer minimax, however, given the added prior information |θ| ≤ s. In fact, the minimax estimate is then Bayes with respect to a least favorable prior distribution that is supported on [−s, s]. This distribution was investigated by Casella and Strawderman (1981) for small values of s, and by Bickel (1981), Levit (1980a–c) and Levit and Berhin (1980) for large values of s. Our interest was captured particularly by Bickel's somewhat surprising result that if the least favorable distributions are rescaled to [−1, 1] then they converge weakly, as s → ∞, to the distribution with density cos²(πx/2) [the distribution with minimum Fisher information among all those supported on [−1, 1]; see Huber (1974)], and the corresponding minimax risks behave like 1 − π²/s² + o(1/s²).
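As a numerical sanity check (not part of the paper), one can verify that the Fisher information of the limiting density cos²(πx/2) on [−1, 1] equals π², the constant appearing in the 1 − π²/s² risk expansion. A minimal sketch, assuming NumPy:

```python
import numpy as np

# Huber's minimum-Fisher-information density on [-1, 1]: p(x) = cos^2(pi*x/2).
# Its score is p'(x)/p(x) = -pi * tan(pi*x/2), so the Fisher information is
# I(p) = integral over [-1, 1] of pi^2 * tan^2(pi*x/2) * cos^2(pi*x/2) dx.
n = 200_000
h = 2.0 / n
x = np.linspace(-1 + h / 2, 1 - h / 2, n)   # midpoints avoid the endpoints +-1
p = np.cos(np.pi * x / 2) ** 2
score = -np.pi * np.tan(np.pi * x / 2)
fisher = np.sum(score ** 2 * p) * h         # midpoint-rule quadrature

print(fisher, np.pi ** 2)  # the two values agree to high accuracy
```

The midpoint grid sidesteps the poles of the tangent at ±1, where the integrand π² sin²(πx/2) is in fact finite.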
Moreover, Bickel produced a family of estimates that attain this risk asymptotically, and proved that they have the property that s(y − δ(y)) is approximately π tan(πy/(2s)).
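The tan-shrinkage form of Bickel's estimates can be illustrated by simulation. The sketch below is my own construction, not code from the paper: it takes δ(y) = y − (π/s)·tan(πy/(2s)), which satisfies s(y − δ(y)) = π tan(πy/(2s)) exactly, and estimates its risk at θ = 0 by Monte Carlo. For s = 10 the result lies near 1 − π²/s² ≈ 0.90, below the unrestricted minimax risk of 1. (Bickel's actual estimates involve modifications near ±s; here observations are simply clipped to avoid the pole of the tangent.)

```python
import numpy as np

s = 10.0                       # half-width of the parameter interval [-s, s]
rng = np.random.default_rng(0)

def delta(y, s):
    """Bickel-type shrinkage estimate: s*(y - delta(y)) = pi*tan(pi*y/(2s))."""
    y = np.clip(y, -0.99 * s, 0.99 * s)  # crude guard against tan's pole at |y| = s
    return y - (np.pi / s) * np.tan(np.pi * y / (2 * s))

theta = 0.0                    # risk at the centre of the interval
y = theta + rng.standard_normal(1_000_000)
risk = np.mean((delta(y, s) - theta) ** 2)

print(risk, 1 - np.pi ** 2 / s ** 2)   # both are close to 0.90
```

For small arguments tan(πy/(2s)) ≈ πy/(2s), so the estimate shrinks y by roughly the factor 1 − π²/(2s²), which already accounts for most of the risk reduction.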
Original language: English (GB)
Pages (from-to): 432-442
Number of pages: 11
Journal: Annals of Statistics
Issue number: 1
State: Published - 1987

