Adaptive restoration of textured images with mixed spectra using a generalized Wiener filter

Ravi Krishnamurthy, John W. Woods, Joseph M. Francos

Research output: Contribution to journal › Article › peer-review



We consider the adaptive restoration of inhomogeneous textured images degraded by linear blur and additive white Gaussian noise (AWGN). The method consists of segmenting the image into individual homogeneous textures and restoring each texture separately. Each individual texture is assumed to be a realization of a regular, homogeneous random field that may possess deterministic components. Therefore, it cannot be directly restored by the conventional Wiener filter. A generalized Wiener filter is developed to accommodate fields with discontinuous spectral distributions and is shown to yield linear minimum mean-squared error (LMMSE) estimates for such fields. This generalized filter is interpreted as a two-channel filter wherein the deterministic component is filtered as if it were undegraded by noise while the purely indeterministic component is restored by a conventional technique. In the absence of blur, unsupervised estimation is achieved by using the expectation-maximization (EM) algorithm for the purely indeterministic component. An existing algorithm that assumes a doubly stochastic Gaussian (DSG) image model and uses simulated annealing is modified to yield good texture segmentation in the presence of noise. The estimation results obtained by our proposed unsupervised algorithm are superior to supervised conventional filtering, both visually and in terms of mean-squared error.
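The two-channel interpretation described above can be sketched in the frequency domain. This is a minimal illustration, not the paper's algorithm: it assumes the blur response `H`, the PSD `P_s` of the purely indeterministic component, the noise variance, and the spectral support mask of the deterministic component are all already known, whereas the paper estimates them (via EM and segmentation).

```python
import numpy as np

def generalized_wiener_restore(y, H, P_s, sigma_n2, det_mask):
    """Two-channel restoration sketch for y = h * x + n (AWGN).

    y        : observed texture (2-D array)
    H        : blur frequency response, same shape as fft2(y)
    P_s      : PSD of the purely indeterministic component (assumed known)
    sigma_n2 : AWGN variance (flat noise PSD)
    det_mask : boolean mask of spectral bins carrying the deterministic
               (discrete-spectrum) component -- assumed precomputed
    """
    Y = np.fft.fft2(y)
    # Channel 1: deterministic component, filtered as if undegraded by
    # noise -- plain inverse filtering restricted to its spectral support.
    H_safe = np.where(H == 0, 1.0, H)  # avoid division by zero
    X_det = np.where(det_mask, Y / H_safe, 0.0)
    # Channel 2: purely indeterministic component, restored by the
    # conventional Wiener filter on the remaining bins.
    W = np.conj(H) * P_s / (np.abs(H) ** 2 * P_s + sigma_n2)
    X_ind = np.where(det_mask, 0.0, W * Y)
    return np.real(np.fft.ifft2(X_det + X_ind))
```

Splitting the spectrum this way reflects the LMMSE result quoted in the abstract: on the discrete-spectrum bins the signal power is unbounded relative to the noise floor, so the Wiener gain degenerates to the inverse filter there.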

Original language: English
Pages (from-to): 333
Number of pages: 1
Journal: IEEE Transactions on Image Processing
Issue number: 3
State: Published - 1 May 1994
Externally published: Yes

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design

