Lossy Image Compression: Domain Decomposition-Based Algorithms

Springer Science & Business Media, Aug 28, 2011 - Computers - 89 pages

Good quality digital images have high storage and bandwidth requirements. As user expectations for image quality continue to rise, efficient compression is necessary to keep memory use and transmission time within reasonable limits.

Image compression is concerned with minimizing the number of information-carrying units used to represent an image. Lossy compression techniques incur some loss of information, which is usually imperceptible. In return for accepting this distortion, we obtain much higher compression ratios than are possible with lossless compression.
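To make the trade-off concrete, here is a minimal illustrative sketch (not taken from the book): uniform quantization as the simplest lossy step, together with the compression ratio it buys. The function names and pixel values are hypothetical.

```python
# Illustrative sketch: uniform quantization (lossy) and the resulting
# compression ratio. Not an algorithm from the book.

def quantize(pixels, levels):
    """Map 8-bit pixel values onto `levels` representative values (lossy)."""
    step = 256 / levels
    return [int((p // step) * step + step / 2) for p in pixels]

def compression_ratio(original_bits, compressed_bits):
    """Ratio of information-carrying units before and after coding."""
    return original_bits / compressed_bits

pixels = [12, 13, 130, 131, 250]
coded = quantize(pixels, levels=4)   # 4 levels need only 2 bits per pixel
ratio = compression_ratio(8 * len(pixels), 2 * len(pixels))
print(coded)   # nearby values collapse onto the same level (distortion)
print(ratio)   # 4.0 -> one quarter of the original storage
```

Note how the two nearby values 12 and 13 become indistinguishable after quantization: that is the information loss traded for the 4:1 ratio.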

Salient features of this book include:

- four new image compression algorithms and their implementations;
- a detailed discussion of fuzzy geometry measures and their application in image compression algorithms;
- new domain decomposition-based algorithms using image quality measures, and a study of various quality measures for gray-scale image compression;
- compression algorithms for different parallel architectures, with evaluation of the encoding time complexity on each architecture;
- parallel implementation of image compression algorithms on a cluster in a Parallel Virtual Machine (PVM) environment.
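A standard example of the kind of quality measure studied for gray-scale compression is peak signal-to-noise ratio (PSNR). The sketch below is a generic illustration, not a measure defined by the book; the pixel lists are hypothetical.

```python
# Illustrative sketch: MSE and PSNR, common gray-scale image quality
# measures. Generic definitions, not specific to this book.
import math

def mse(original, reconstructed):
    """Mean squared error between two equal-length gray-scale pixel lists."""
    return sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)

def psnr(original, reconstructed, max_value=255):
    """Peak signal-to-noise ratio in dB; higher means less distortion."""
    e = mse(original, reconstructed)
    if e == 0:
        return float("inf")  # identical images: no distortion at all
    return 10 * math.log10(max_value ** 2 / e)

ref = [50, 60, 70, 80]          # hypothetical original pixels
rec = [52, 59, 71, 78]          # hypothetical reconstruction after lossy coding
print(round(psnr(ref, rec), 2)) # around 44 dB: small, likely imperceptible error
```

Quality measures like this let a domain decomposition scheme decide, per region, whether the current approximation is good enough or the region must be subdivided further.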

This book will be of interest to graduate students, researchers, and practicing engineers looking for new image compression techniques that provide good perceived quality in digital images with higher compression ratios than are possible with conventional algorithms.


1 Introduction
2 Tree Triangular Coding Image Compression Algorithms
3 Image Compression Using Quality Measures
4 Parallel Image Compression Algorithms
5 Conclusions and Future Directions
