Image Enhancement

Speckle Reduction with Edge-Preserving

Authors:

Pierre Mathieu, UNSA (France)
Laurent Dirat, UNSA (France)
Xavier Dupuis, UNSA (France)
Michel Barlaud, UNSA (France)

Volume 4, Page 2785

Abstract:

Coherent imagery has emerged as one of the major domains in image processing, covering applications as diverse as radar, medical imaging and surface analysis. Whatever the application, the resulting image is corrupted by noise. In coherent imaging, images suffer from speckle noise, whose main characteristic is to be multiplicative. The proposed method explicitly takes into account the multiplicative nature of the noise while preserving discontinuities in the restored image. Moreover, in a second step our algorithm estimates the noise itself, so the information contained in the speckle remains usable.
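A minimal numpy sketch of the multiplicative speckle model described above (the step image, the gamma noise and the look count L = 4 are invented for illustration, not taken from the paper). A log transform converts the multiplicative degradation into an additive one, the usual entry point for additive-noise restoration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical clean "image": a piecewise-constant step (values invented).
clean = np.ones((64, 64))
clean[:, 32:] = 4.0

# Multiplicative speckle: y = x * n with unit-mean gamma noise, as in an
# L-look intensity speckle model (L = 4 chosen arbitrarily here).
L = 4
speckle = rng.gamma(shape=L, scale=1.0 / L, size=clean.shape)
noisy = clean * speckle

# The log transform turns the multiplicative degradation into an additive one:
# log y = log x + log n.
log_noisy = np.log(noisy)
```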

ic972785.pdf




Multiscale Contrast Enhancement of Medical Images

Authors:

Giuseppe Boccignone, DIIIE - Universita' di Salerno (Italy)
Antonio Picariello, DIS - Universita' di Napoli (Italy)

Volume 4, Page 2789

Abstract:

We present results obtained by different contrast enhancement methods applied to medical images. We take into account classical histogram specification, local and wavelet-based techniques, and a novel approach for multiscale contrast enhancement. The latter, whose rationale is grounded in theories of visual perception, exploits a local definition of the Fechner-Weber contrast within the context of a non-linear scale-space representation generated by anisotropic diffusion. Our experiments concern a difficult class of medical images, namely digital mammograms.
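As a rough illustration of the local Fechner-Weber contrast the abstract builds on, the sketch below uses a plain 3x3 box-filter background rather than the paper's anisotropic-diffusion scale-space, and invented image values:

```python
import numpy as np

def box_mean3(a):
    # 3x3 local mean with edge replication -- a crude stand-in for the
    # background luminance at one scale of a scale-space representation.
    p = np.pad(a, 1, mode='edge')
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

img = np.full((32, 32), 50.0)
img[12:20, 12:20] = 80.0          # bright patch on a mid-grey background

background = box_mean3(img)
# Fechner-Weber contrast: deviation from the local background, relative to
# that background (Weber's law: perceived contrast ~ dI / I).
weber = (img - background) / background
```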

ic972789.pdf




Denoising of Electron Tomographic Reconstructions from Biological Specimen Using Multidimensional Multiscale Transforms

Authors:

Arne Stoschek, Max-Planck-Inst. for Biochemistry (Germany)
Thomas P.Y. Yu, Stanford University (U.S.A.)
Reiner Hegerl, Max-Planck-Inst. for Biochemistry (Germany)

Volume 4, Page 2793

Abstract:

In electron tomographic reconstructions of biological specimens, the information about their structure is not directly accessible, since most of the signal is buried in noise. Interpreting the images using surface and volume rendering techniques is difficult due to the noise sensitivity of rendering algorithms. We explore the use of various multiscale representations for denoising 2D and 3D images. Orthogonal wavelet transforms applied to multidimensional data give poor results due to their lack of translational and directional invariance. Extending the 1D translation-invariant denoising algorithm of Coifman and Donoho to higher dimensions overcomes the poor performance of orthogonal wavelet transforms. We present a method to quantify the loss of information due to denoising artifacts on data with an unknown signal-to-noise relationship, and propose a scheme for denoising such data. Experiments show that invariant wavelet denoising performs well in reconstructing signals from noisy 3D data while preserving most of the actual information.
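The translation-invariant idea of Coifman and Donoho mentioned above can be sketched in 1D: denoise every circular shift of the signal, undo the shift, and average. The one-level Haar transform and soft threshold below are simplifications chosen for brevity; the paper's multidimensional transforms are not reproduced here:

```python
import numpy as np

def haar_denoise(x, thresh):
    # One-level Haar transform, soft-threshold the detail coefficients,
    # invert. A deliberately minimal stand-in for a full wavelet denoiser.
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

def cycle_spin_denoise(x, thresh):
    # Coifman-Donoho translation-invariant denoising: average the result of
    # "shift, denoise, unshift" over every circular shift of the signal.
    n = len(x)
    acc = np.zeros(n)
    for s in range(n):
        acc += np.roll(haar_denoise(np.roll(x, s), thresh), -s)
    return acc / n
```

Averaging over all shifts makes the result exactly invariant to circular translations of the input, which removes the shift-dependent artifacts of the plain orthogonal transform.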

ic972793.pdf




Quantized Bi-Histogram Equalization

Authors:

Yeong-Taeg Kim, Samsung (Korea)

Volume 4, Page 2797

Abstract:

Histogram equalization is a widely used scheme for contrast enhancement in a variety of applications due to its simplicity and effectiveness. One possible drawback of histogram equalization is that it can change the mean brightness of an image significantly as a consequence of histogram flattening. In an effort to overcome this drawback and extend the applications of histogram equalization to consumer electronic products, the author has proposed bi-histogram equalization, which is capable of preserving the mean brightness of an image while performing contrast enhancement. The essence of bi-histogram equalization is to apply independent histogram equalizations separately to two subimages obtained by decomposing the input image based on its mean. In this paper, a simplified version of bi-histogram equalization is proposed, which will be referred to as quantized bi-histogram equalization. The proposed algorithm provides a much simpler hardware structure than bi-histogram equalization, since it is based on the cumulative distribution function of a quantized image. Thus, a hardware realization of bi-histogram equalization becomes much more feasible, which leads to versatile applications in the field of consumer electronics.
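For readers unfamiliar with the underlying technique, here is a minimal numpy sketch of bi-histogram equalization (split at the mean, equalize each sub-histogram into its own grey-level range); the quantization refinement that is the paper's contribution is not shown:

```python
import numpy as np

def bi_histogram_equalize(img, levels=256):
    # Brightness-preserving bi-histogram equalization: decompose the image
    # at its mean, then equalize each sub-image into its own output range
    # ([0, m] and [m+1, levels-1]) so mean brightness is roughly preserved.
    mean = img.mean()
    m = int(round(mean))
    out = np.empty(img.shape, dtype=np.float64)
    lower = img <= mean
    for mask, lo, hi in ((lower, 0, m), (~lower, m + 1, levels - 1)):
        vals = img[mask].astype(int)
        if vals.size == 0:
            continue
        # CDF of the sub-image, mapped linearly onto [lo, hi].
        cdf = np.bincount(vals, minlength=levels).cumsum() / vals.size
        out[mask] = lo + cdf[vals] * (hi - lo)
    return out
```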

ic972797.pdf




On description of impulse noise removal using PWL filter model

Authors:

Wenzhe Li, University of Erlangen (Germany)
Ji-Nan Lin, Thomson Multimedia, Villingen (Germany)
Rolf Unbehauen, University of Erlangen (Germany)

Volume 4, Page 2801

Abstract:

The classical method for impulsive noise removal in image signals consists of two steps: a step for impulse detection and a step for estimating the pixels corrupted by impulsive noise. In this paper, we show that the impulse detection step can be described very naturally by piecewise-linear (PWL) functions, which intuitively explains the detection process as a partition of the signal domain space. A PWL filter model is then proposed. In this filter model, we can use either a linear or a nonlinear subfilter to estimate the corrupted pixels; in the latter case, using a median subfilter yields a multi-level PWL filter. The filter model is simulated, and its excellent results are compared with those of a median filter.
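A minimal detect-then-estimate scheme of this kind is easy to sketch in numpy. The threshold test against the local median below is one simple, concrete choice of piecewise-linear detection region, with a median subfilter estimating the flagged pixels; it is illustrative, not the paper's exact filter:

```python
import numpy as np

def switching_median(img, thresh=40.0):
    # Two-step impulse removal: (1) detect pixels whose deviation from the
    # local 3x3 median exceeds a threshold (a piecewise-linear decision
    # boundary in the signal domain), (2) replace only those pixels with
    # the median subfilter output; all other pixels pass through unchanged.
    h, w = img.shape
    p = np.pad(img.astype(np.float64), 1, mode='edge')
    stack = np.stack([p[i:i + h, j:j + w]
                      for i in range(3) for j in range(3)])
    med = np.median(stack, axis=0)
    detected = np.abs(img - med) > thresh
    return np.where(detected, med, img), detected
```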

ic972801.pdf




Identification of the Nature of Noise and Estimation of its Statistical Parameters by Analysis of Local Histograms

Authors:

Lionel Beaurepaire, Univ. Rennes I, ENSSAT (France)
Kacem Chehdi, Univ. Rennes I, ENSSAT (France)
Benoit Vozel, Univ. Rennes I, ENSSAT (France)

Volume 4, Page 2805

Abstract:

This paper deals with the problem of identifying the nature of noise and estimating its standard deviation from the observed image, in order to be able to apply the most appropriate processing or analysis algorithm afterwards. In this study, we focus our attention on three classes of noise-degraded images: the first degraded by additive noise, the second by multiplicative noise and the third by impulsive noise. First, in order to identify the nature of the noise, we propose a new approach that characterizes each class by a parameter obtained from histograms computed over several homogeneous regions of the observed image. The homogeneous regions are obtained by segmenting the image. Then, the standard deviation is estimated from the analysis of a histogram of local standard deviations computed over each of the homogeneous regions.
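The standard-deviation step can be illustrated on synthetic data: corrupt a homogeneous image with additive Gaussian noise, compute local standard deviations (here over fixed 8x8 blocks, standing in for the homogeneous regions a segmentation step would supply), and read the noise level off the peak of their histogram. All sizes and values below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Homogeneous test image corrupted by additive Gaussian noise (sigma = 5).
true_sigma = 5.0
img = 100.0 + rng.normal(0.0, true_sigma, size=(64, 64))

# Local standard deviations over non-overlapping 8x8 blocks.
blocks = img.reshape(8, 8, 8, 8).swapaxes(1, 2).reshape(-1, 64)
local_stds = blocks.std(axis=1, ddof=1)

# Estimate the noise std as the peak (mode) of the local-std histogram:
# on homogeneous regions the local std reflects the noise alone.
hist, edges = np.histogram(local_stds, bins=20)
mode = 0.5 * (edges[hist.argmax()] + edges[hist.argmax() + 1])
```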

ic972805.pdf




A Bayesian approach to Blind Deconvolution based on Dirichlet Distributions

Authors:

Rafael Molina, University of Granada (Spain)
Aggelos K. Katsaggelos, Northwestern University (U.S.A.)
Javier Abad, University of Granada (Spain)
Javier Mateos, University of Granada (Spain)

Volume 4, Page 2809

Abstract:

This paper deals with the simultaneous identification of the blur and the restoration of a noisy and blurred image. We propose the use of Dirichlet distributions to model our prior knowledge about the blurring function, together with smoothness constraints on the restored image, to solve the blind deconvolution problem. We show that Dirichlet distributions offer great flexibility in incorporating vague or very precise knowledge about the blurring process into the blind deconvolution process. The proposed MAP estimator offers additional flexibility in modeling the original image. Experimental results demonstrate the performance of the proposed algorithm.
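Why a Dirichlet prior suits a blur kernel is easy to see numerically: a point spread function is nonnegative and sums to one, which is exactly the support of a Dirichlet distribution, and the concentration parameter tunes how vague or precise the prior knowledge is. The nominal kernel and concentration values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical nominal 1-D blur kernel: nonnegative, sums to one.
nominal = np.array([1.0, 2.0, 4.0, 2.0, 1.0])
nominal /= nominal.sum()

# Concentration alpha = s * nominal: small s encodes vague knowledge about
# the blur, large s encodes precise knowledge. Every sample is itself a
# valid PSF (nonnegative, unit sum).
vague = rng.dirichlet(10.0 * nominal, size=2000)
precise = rng.dirichlet(1000.0 * nominal, size=2000)

# Mean absolute deviation from the nominal kernel shrinks as s grows.
spread_vague = np.abs(vague - nominal).mean()
spread_precise = np.abs(precise - nominal).mean()
```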

ic972809.pdf
