Image Compression I

Chair: R. Gray, Stanford University, USA

A Lossless Image Coder with Context Classification, Adaptive Prediction and Adaptive Entropy Coding

Authors:

Farshid Golchin, Griffith University (Australia)
Kuldip K. Paliwal, Griffith University (Australia)

Volume 5, Page 2545, Paper number 1279

Abstract:

In this paper, we combine a context classification scheme with adaptive prediction and entropy coding to produce an adaptive lossless image coder. In this coder, we maximize the benefits of adaptivity by using both adaptive prediction and adaptive entropy coding. The adaptive prediction is closely tied to the classification of contexts within the image. Contexts are defined with respect to local edge, texture or gradient characteristics, as well as local activity within small blocks of the image. For each context, an optimal predictor is found and used to predict all pixels belonging to that particular context. A clustering algorithm is used to design an optimal entropy coding scheme for the prediction residual. The combination of these two adaptive techniques produces some of the best lossless coding results reported so far.
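
The abstract does not spell out the context definition or the predictor design; the Python sketch below, using a hypothetical gradient-and-activity classification and per-context least-squares linear predictors, only illustrates the general idea of context-classified adaptive prediction.

```python
import numpy as np

def causal_context(img, r, c):
    """Classify the causal neighbourhood of pixel (r, c) into one of 12 contexts.

    Hypothetical scheme (not the paper's exact definition): an orientation
    context from comparing horizontal and vertical variation estimates,
    and an activity context from quantizing their sum.
    """
    w, n, nw, ne = (int(img[r, c - 1]), int(img[r - 1, c]),
                    int(img[r - 1, c - 1]), int(img[r - 1, c + 1]))
    dh = abs(w - nw) + abs(ne - n)                              # horizontal variation
    dv = abs(n - nw) + abs(w - nw)                              # vertical variation
    orient = 0 if dh > dv + 8 else (1 if dv > dh + 8 else 2)    # edge direction
    act = int(np.digitize(dh + dv, [8, 32, 96]))                # activity level 0..3
    return orient * 4 + act                                     # 12 contexts

def train_predictors(img, n_ctx=12):
    """Fit one least-squares linear predictor (on W, N, NW, NE) per context."""
    feats = {k: [] for k in range(n_ctx)}
    targets = {k: [] for k in range(n_ctx)}
    for r in range(1, img.shape[0]):
        for c in range(1, img.shape[1] - 1):
            k = causal_context(img, r, c)
            feats[k].append([img[r, c - 1], img[r - 1, c],
                             img[r - 1, c - 1], img[r - 1, c + 1], 1.0])
            targets[k].append(img[r, c])
    preds = {}
    for k in range(n_ctx):
        if feats[k]:
            A, b = np.asarray(feats[k], float), np.asarray(targets[k], float)
            preds[k], *_ = np.linalg.lstsq(A, b, rcond=None)    # predictor weights
    return preds
```

The residual of each context's prediction would then be handed to a separately adapted entropy coder, which is not shown here.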

ic981279.pdf (From Postscript)


A New Approach for Reducing Blockiness in DCT Image Coders

Authors:

Stephen A Martucci, Scitex Digital Video (U.S.A.)

Volume 5, Page 2549, Paper number 1307

Abstract:

This paper presents a new approach for reducing the blockiness that occurs when DCT image coders are used at high compression ratios. The method simply replaces the inverse DCT-2 in the decoder with a larger inverse DCT-1, followed by overlapping and averaging of the enlarged blocks to reconstruct the image. The modified decoder can decode any bitstream generated by a standard encoder. Blockiness is reduced with no noticeable distortion or loss of sharpness in the image, and no significant increase in complexity.
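
The enlargement via the inverse DCT-1 is not detailed in the abstract, but the final overlap-and-average step can be sketched as follows, assuming each decoded block has already been enlarged by one pixel on every side (the block and padding sizes here are illustrative, not the paper's):

```python
import numpy as np

def overlap_average(enlarged_blocks, img_shape, block=8, pad=1):
    """Overlap-and-average reconstruction from enlarged decoded blocks.

    enlarged_blocks[(r, c)] is a (block + 2*pad)-sized block whose inner
    block x block region sits at (r, c) on the original block grid.
    """
    acc = np.zeros(img_shape, dtype=np.float64)
    weight = np.zeros(img_shape, dtype=np.float64)
    H, W = img_shape
    for (r, c), blk in enlarged_blocks.items():
        r0, c0 = r - pad, c - pad                          # enlarged block origin
        rs, cs = max(r0, 0), max(c0, 0)                    # clip to the image
        re, ce = min(r0 + block + 2 * pad, H), min(c0 + block + 2 * pad, W)
        acc[rs:re, cs:ce] += blk[rs - r0:re - r0, cs - c0:ce - c0]
        weight[rs:re, cs:ce] += 1.0
    return acc / np.maximum(weight, 1.0)                   # average the overlaps
```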

ic981307.pdf (From Postscript)


Stack-Run-End Compression for Low Bit Rate Color Image Communication

Authors:

Min-Jen Tsai, National Chiao-Tung University (Taiwan)

Volume 5, Page 2553, Paper number 1661

Abstract:

In this paper, a new wavelet image coding algorithm is designed for color image compression. The algorithm uses a multi-ary symbol set to represent the significant coefficients in the wavelet transform domain that are needed for image reconstruction in each color channel. The scheme first performs a color-space conversion, then raster-scans each subband to convert the data into a symbol representation. An adaptive arithmetic coder is then used to compress the symbols with high efficiency. Unlike zerotree coding and its variations, which are essentially intersubband coding approaches whose complexity lies in addressing the location relationships across subbands, this work is a low-complexity intrasubband coding method with context specification within each subband and a termination symbol across subbands. Compared with refined zerotree schemes, this algorithm yields competitive PSNR values and perceptually high-quality images at the same compression ratio for color image compression.
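
The exact stack-run-end alphabet is not reproduced in the abstract; the sketch below shows the intrasubband idea in simplified form, mapping a raster-scanned, quantized subband to (zero-run, value) symbols with an end-of-subband marker standing in for the termination symbol. The symbols would then be fed to an adaptive arithmetic coder (not shown).

```python
def run_value_symbols(subband_coeffs):
    """Map a raster-scanned, quantized subband to (zero-run, value) symbols.

    Simplified stand-in for the paper's stack-run-end alphabet: runs of
    zeros are counted, each significant coefficient terminates a run, and
    'EOS' marks the end of the subband (the termination symbol).
    """
    symbols, run = [], 0
    for q in subband_coeffs:
        if q == 0:
            run += 1
        else:
            symbols.append(('RUN', run))     # zeros preceding the coefficient
            symbols.append(('VAL', q))       # significant coefficient itself
            run = 0
    symbols.append(('EOS', run))             # trailing zeros folded into EOS
    return symbols

# Example: one raster-scanned row of quantized coefficients
print(run_value_symbols([0, 0, 5, 0, -2, 0, 0, 0]))
# [('RUN', 2), ('VAL', 5), ('RUN', 1), ('VAL', -2), ('EOS', 3)]
```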

ic981661.pdf (From Postscript)


DC Coefficient Restoration Using MAP Estimation Technique

Authors:

Fu-wing Tse, The Chinese University of Hong Kong (Hong Kong)
Wai-Kuen Cham, The Chinese University of Hong Kong (Hong Kong)

Volume 5, Page 2557, Paper number 1479

Abstract:

DC coefficient restoration is a technique that can increase the compression ability of transform image coding by not transmitting the DC coefficients and instead estimating them from the transmitted AC components. For the last decade, the minimum edge difference criterion has been used in this scheme. However, the criterion fails at locations where discontinuities lie along block boundaries, resulting either in an observable blocking effect around these locations or in a higher bit rate. In this paper, we propose a new criterion using the maximum a posteriori (MAP) estimation technique, which preserves discontinuities during DC coefficient restoration and removes the blocking effects in the restored images.
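
The MAP formulation itself is not given in the abstract; the sketch below only illustrates the baseline minimum edge difference rule it improves upon, in which the missing DC value is chosen to minimize the squared intensity differences across boundaries shared with already-restored neighbours.

```python
import numpy as np

def restore_dc_min_edge_diff(ac_block, restored_neighbors):
    """Estimate a block's missing DC term with the minimum edge difference rule.

    ac_block: the block reconstructed from AC coefficients only (zero mean).
    restored_neighbors: list of (neighbor_boundary, own_boundary) pixel rows or
    columns facing each other across a shared block edge (numpy arrays).

    Minimizing  sum_k (neighbor[k] - (own[k] + d))^2  over the DC offset d
    gives d = mean over all boundary pairs of (neighbor[k] - own[k]).
    The paper's MAP criterion replaces this quadratic penalty with an
    edge-preserving one so that true discontinuities are not smoothed away.
    """
    diffs = [nb.astype(float) - own.astype(float)
             for nb, own in restored_neighbors]
    d = np.concatenate(diffs).mean()
    return ac_block + d
```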

ic981479.pdf (From Postscript)


Model-Based Edge Reconstruction for Low Bit-Rate Wavelet-Based Image Coding

Authors:

G.L. Fan, The Chinese University of Hong Kong (Hong Kong)
Wai-Kuen Cham, The Chinese University of Hong Kong (Hong Kong)
J.Z. Liu, The Chinese University of Hong Kong (Hong Kong)

Volume 5, Page 2561, Paper number 1294

Abstract:

Low bit-rate image coding causes obvious degradation in the compressed images, among which distortions at edges are particularly objectionable. In this paper, a model-based edge reconstruction algorithm is proposed for wavelet-based image coding at low bit rates. Our approach applies a general model to represent the variety of edges existing in an image. Based on this model, the edge reconstruction problem is formulated as finding the original edge-model parameters from the lossy image. The proposed method improves the subjective visual quality and fidelity (PSNR) of images coded by wavelet-based coding with zerotree quantization. Our algorithm can also be adapted to other wavelet-based coding methods that produce quantization results similar to those of zerotree quantization. Experimental results show that it performs well for most images with notable structures. Our approach is promising for stretching the performance of wavelet-based coding at low bit rates. More demonstrations of edge reconstruction results can be found at http://www.ee.cuhk.edu.hk/~glfan/icassp98.html
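
The paper's general edge model is not specified in the abstract; purely as an illustration of model-based edge reconstruction, the sketch below fits a hypothetical parametric step-edge model (a scaled, shifted sigmoid) to a 1-D intensity profile taken across a degraded edge.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid_edge(x, base, height, center, width):
    """Hypothetical 1-D edge model: a smooth step of given height and width."""
    return base + height / (1.0 + np.exp(-(x - center) / width))

def fit_edge_profile(profile):
    """Fit the edge model to an intensity profile sampled across an edge."""
    x = np.arange(len(profile), dtype=float)
    p0 = [profile.min(), profile.max() - profile.min(), len(profile) / 2.0, 1.0]
    params, _ = curve_fit(sigmoid_edge, x, profile.astype(float), p0=p0)
    return params          # (base, height, center, width) of the restored edge

# Example: a noisy, blurred step edge
x = np.arange(16, dtype=float)
noisy = sigmoid_edge(x, 50, 100, 8, 2.0) + np.random.normal(0, 2, 16)
print(fit_edge_profile(noisy))
```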

ic981294.pdf (From Postscript)


Compression Algorithms for Classification of Remotely Sensed Images

Authors:

Frank Tintrup, University of Cagliari (Italy)
Francesco G.B. De Natale, University of Cagliari (Italy)
Daniele D. Giusto, University of Cagliari (Italy)

Volume 5, Page 2565, Paper number 2484

Abstract:

The paper presents a comparison of the principal lossy compression algorithms, Vector Quantization (VQ), JPEG and wavelets (WV), applied after a Karhunen-Loeve Transform (KLT) to multispectral remotely sensed images and evaluated with the K-NN classification algorithm. The main goal of compressing remotely sensed images is to reduce the huge downlink and storage requirements. The Karhunen-Loeve Transform first removes the interband correlation to produce the principal components of the image, which are then compressed by the algorithms above. Quality was evaluated by supervised classification with the well-known K-NN algorithm, as used in remote sensing applications, and by MSE for the visual aspects. The results of this detailed analysis of current compression techniques are quite surprising when compared with other recent works.
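
A minimal sketch of the decorrelation step described above, assuming the multispectral image is stored as a (bands, rows, cols) array: the KLT, computed here from the interband covariance, maps the bands to principal components, each of which would then be handed to VQ, JPEG or a wavelet coder.

```python
import numpy as np

def klt_bands(ms_image):
    """Karhunen-Loeve Transform across the spectral bands of an image.

    ms_image: array of shape (bands, rows, cols).
    Returns (principal_components, eigenvectors, band_means) so that the
    transform can be inverted after each component is coded and decoded.
    """
    b, r, c = ms_image.shape
    X = ms_image.reshape(b, -1).astype(np.float64)     # one row per band
    means = X.mean(axis=1, keepdims=True)
    Xc = X - means
    cov = Xc @ Xc.T / Xc.shape[1]                      # interband covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]                  # strongest component first
    eigvecs = eigvecs[:, order]
    pcs = (eigvecs.T @ Xc).reshape(b, r, c)            # principal components
    return pcs, eigvecs, means

def inverse_klt(pcs, eigvecs, means):
    """Reassemble the spectral bands from (possibly coded) components."""
    b, r, c = pcs.shape
    X = eigvecs @ pcs.reshape(b, -1) + means
    return X.reshape(b, r, c)
```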

ic982484.pdf (From Postscript)


Locally-adaptive Image Coding based on a Perceptual Target Distortion

Authors:

Ingo Hontsch, Arizona State University (U.S.A.)
Lina J Karam, Arizona State University (U.S.A.)

Volume 5, Page 2569, Paper number 2532

Abstract:

This paper presents a perceptual-based image coder that discriminates between image components based on their perceptual relevance, achieving increased performance in terms of quality and bit rate. The new coder uses a locally adaptive perceptual quantization scheme based on a tractable perceptual distortion metric. Our strategy is to exploit human visual masking properties by deriving visual masking thresholds in a locally adaptive fashion. The derived masking thresholds are used to control the quantization stage by adapting the quantizer reconstruction levels in order to meet a desired target perceptual distortion. The proposed coding scheme is flexible in that it works with any subband-based decomposition as well as with block-based transform methods. Compared to existing perceptual transform-based and block-based methods, the proposed perceptual coding method exhibits superior performance in terms of bit rate and distortion control. Coding results are presented to illustrate the performance of the proposed scheme.
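
The coder's actual masking model is not given in the abstract; the sketch below uses a crude stand-in in which each coefficient's quantization step is scaled by a locally derived threshold based on neighbourhood variance, so that more distortion is placed where local activity can mask it.

```python
import numpy as np

def perceptual_quantize(coeffs, base_step, neighborhood=3):
    """Locally adaptive quantization driven by a crude masking threshold.

    coeffs: 2-D array of subband or block-transform coefficients.
    The threshold (1 + 0.5 * local standard deviation) is a hypothetical
    stand-in for the paper's perceptual distortion metric.
    """
    h, w = coeffs.shape
    pad = neighborhood // 2
    padded = np.pad(coeffs, pad, mode='edge')
    local_std = np.empty((h, w), dtype=np.float64)
    for r in range(h):
        for c in range(w):
            local_std[r, c] = padded[r:r + neighborhood,
                                     c:c + neighborhood].std()
    step = base_step * (1.0 + 0.5 * local_std)          # larger step = coarser
    indices = np.round(coeffs / step)                   # quantizer indices
    reconstructed = indices * step                      # reconstruction levels
    return indices, reconstructed, step
```

In a real coder the thresholds would have to be derivable from already-decoded data, so that the step map itself need not be transmitted.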

ic982532.pdf (From Postscript)


A Non Uniform Segmentation Optimal Hybrid Fractal/DCT Image Compression Algorithm

Authors:

Gerry Melnikov, Northwestern University (U.S.A.)
Aggelos K Katsaggelos, Northwestern University (U.S.A.)

Volume 5, Page 2573, Paper number 2564

Abstract:

In this paper a hybrid fractal and Discrete Cosine Transform (DCT) coder is developed. Drawing on the ability of the DCT to remove inter-pixel redundancies and on the ability of fractal transforms to capitalize on long-range correlations in the image, the hybrid coder performs a bit allocation among coding parameters that is optimal in the rate-distortion sense. An orthogonal basis framework is used within which an image segmentation and a hybrid block-based transform are selected jointly. A Lagrange multiplier approach is used to optimize the hybrid parameters and the segmentation. Differential encoding of the DC coefficient is employed, with the scanning path based on a third-order Hilbert curve. Simulation results show a significant improvement in quality with respect to the JPEG standard.
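
A minimal sketch of the Lagrangian bit-allocation idea, with hypothetical per-block candidates: for every block the coder evaluates the distortion and rate of each candidate coding (fractal or DCT at various settings) and keeps the one minimizing D + lambda * R.

```python
def lagrangian_select(block_candidates, lam):
    """Rate-distortion optimal per-block mode selection.

    block_candidates: for each block, a list of (mode, distortion, rate_bits)
    tuples, e.g. fractal and DCT codings at several quantizer settings
    (hypothetical candidates, not the paper's exact parameter set).
    lam: the Lagrange multiplier trading distortion against rate.
    """
    chosen = []
    for candidates in block_candidates:
        best = min(candidates, key=lambda m: m[1] + lam * m[2])
        chosen.append(best)
    total_dist = sum(m[1] for m in chosen)
    total_rate = sum(m[2] for m in chosen)
    return chosen, total_dist, total_rate

# Example: two blocks, each with a fractal and a DCT candidate
blocks = [[('fractal', 12.0, 40), ('dct', 8.0, 90)],
          [('fractal', 30.0, 35), ('dct', 25.0, 60)]]
print(lagrangian_select(blocks, lam=0.1))
```

In practice the multiplier lambda is adjusted, for example by bisection, until the total rate meets the target budget.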

ic982564.pdf (From Postscript)


Orthogonal Subspace Projection Filtering for Stereo Image Compression

Authors:

Sang-Hoon Seo, Colorado State University (U.S.A.)
Mahmood R. Azimi-Sadjadi, Colorado State University (U.S.A.)

Volume 5, Page 2577, Paper number 2354

Abstract:

This paper presents a 2-D filtering scheme for stereo image compression using orthogonal subspace projection. To provide more candidate blocks for the input data, the support region for the input data is extended in the reference image. In addition, edge blocks are added to the candidate input blocks to provide better compensation for the edges and boundaries of objects. The best input blocks are then selected one by one, in order of their importance for reconstructing the desired block, using the Gram-Schmidt orthogonalization algorithm. Simulation results show excellent performance of the proposed scheme compared to standard block-matching and least-squares (LS) based 2-D filtering schemes.
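
A sketch of the greedy selection step under simplified assumptions: candidate blocks from the reference image (including edge blocks) are flattened into vectors, and Gram-Schmidt orthogonalization picks, one at a time, the candidate whose orthogonalized direction captures the most energy of the remaining residual of the desired block.

```python
import numpy as np

def select_blocks_gram_schmidt(target_block, candidate_blocks, n_select):
    """Greedy orthogonal-subspace selection of candidate blocks.

    target_block: the block in the target image to be predicted (2-D array).
    candidate_blocks: list of same-sized candidate blocks from the reference
    image (and edge blocks); all are flattened into vectors.
    Returns the indices of the selected candidates and the approximation.
    """
    t = target_block.astype(np.float64).ravel()
    cands = [c.astype(np.float64).ravel() for c in candidate_blocks]
    basis, selected = [], []
    residual = t.copy()
    for _ in range(n_select):
        best_idx, best_gain, best_dir = None, 0.0, None
        for i, c in enumerate(cands):
            if i in selected:
                continue
            d = c.copy()
            for q in basis:                       # orthogonalize against basis
                d -= (d @ q) * q
            norm = np.linalg.norm(d)
            if norm < 1e-12:                      # candidate already spanned
                continue
            d /= norm
            gain = abs(residual @ d)              # energy captured if selected
            if gain > best_gain:
                best_idx, best_gain, best_dir = i, gain, d
        if best_idx is None:
            break
        selected.append(best_idx)
        basis.append(best_dir)
        residual -= (residual @ best_dir) * best_dir
    approx = t - residual
    return selected, approx.reshape(target_block.shape)
```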

ic982354.pdf (From Postscript)


Finding a Suitable Wavelet for Image Compression Applications

Authors:

Shahid Masud, The Queen's University of Belfast (Northern Ireland)
John V. McCanny, The Queen's University of Belfast (Northern Ireland)

Volume 5, Page 2581, Paper number 1056

Abstract:

In this paper we assess the relative merits of various types of wavelet functions for use in a wide range of image compression scenarios. We have delineated different algorithmic criteria that can be used for wavelet evaluation. The assessment covers both algorithmic aspects (fidelity, perceptual quality) and suitability for real-time implementation in hardware. The results obtained indicate that, of the wavelets studied, the biorthogonal 9/7-tap wavelet is the most suitable from a compression perspective, while the 8-tap Daubechies wavelet gives the best performance when assessed solely in terms of statistical measures.
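
A minimal sketch of the kind of comparison described above, using PyWavelets (an assumption; the paper's own implementation is not specified): each candidate wavelet decomposes the image, a fixed fraction of the smallest detail coefficients is discarded to mimic compression, and the PSNR of the reconstruction is compared. 'bior4.4' is PyWavelets' 9/7-tap biorthogonal wavelet and 'db4' its 8-tap Daubechies wavelet.

```python
import numpy as np
import pywt

def psnr(orig, rec):
    mse = np.mean((orig.astype(float) - rec.astype(float)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

def compare_wavelets(image, wavelets=('bior4.4', 'db4'), keep=0.05, level=3):
    """Crude fidelity comparison: keep only the largest detail coefficients."""
    results = {}
    for name in wavelets:
        coeffs = pywt.wavedec2(image, name, level=level)
        detail = np.concatenate([np.abs(a).ravel()
                                 for lvl in coeffs[1:] for a in lvl])
        thresh = np.quantile(detail, 1.0 - keep)       # discard smallest 95%
        new_coeffs = [coeffs[0]]                       # keep the approximation
        for lvl in coeffs[1:]:
            new_coeffs.append(tuple(np.where(np.abs(a) >= thresh, a, 0.0)
                                    for a in lvl))
        rec = pywt.waverec2(new_coeffs, name)[:image.shape[0], :image.shape[1]]
        results[name] = psnr(image, rec)
    return results

# Example with a random test image (use a real image for meaningful numbers)
img = np.random.randint(0, 256, (64, 64)).astype(float)
print(compare_wavelets(img))
```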

ic981056.pdf (From Postscript)
