A Student's Guide to Coding and Information Theory
Cambridge University Press, Jan 26, 2012 - Technology & Engineering - 191 pages
This easy-to-read guide provides a concise introduction to the engineering background of modern communication systems, from mobile phones to data compression and storage. Background mathematics and specific engineering techniques are kept to a minimum, so only a basic knowledge of high-school mathematics is needed to understand the material covered. The authors begin with many practical applications of coding, including the repetition code, the Hamming code, and the Huffman code. They then explain the corresponding information theory, from entropy and mutual information to channel capacity and the information transmission theorem. Finally, they provide insights into the connections between coding theory and other fields. Many worked examples are given throughout the book, using practical applications to illustrate theoretical definitions. Exercises are also included, enabling readers to double-check what they have learned and gain glimpses into more advanced topics. The result is an ideal quick introduction to the subject.
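As a taste of the first topic the guide covers, the three-times repetition code can be sketched in a few lines of Python. This is an illustrative sketch only (the function names are mine, not the book's): each bit is transmitted three times, and the decoder takes a majority vote, which corrects any single bit error per block.

```python
def encode(bits):
    # Three-times repetition code: send each bit three times.
    return [b for b in bits for _ in range(3)]

def decode(received):
    # Majority vote over each block of three received bits
    # corrects any single flipped bit per block.
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

codeword = encode([1, 0, 1])           # [1, 1, 1, 0, 0, 0, 1, 1, 1]
corrupted = codeword.copy()
corrupted[1] = 0                       # flip one bit in the first block
assert decode(corrupted) == [1, 0, 1]  # the error is corrected
```

The price of this robustness is rate: three channel bits carry one information bit, which is exactly the trade-off the book's later chapters on channel capacity make precise.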
Po-Ning Chen
Chapters
Repetition and Hamming codes
Efficient coding of a random message
Mutual information and channel capacity
Approaching the Shannon limit by turbo coding
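The chapter on mutual information and channel capacity centers on results like the capacity of the binary symmetric channel. A minimal sketch of that formula, with function names of my own choosing: the capacity is C = 1 - H(eps) bits per channel use, where H is the binary entropy function and eps the crossover probability.

```python
import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    # Capacity of the binary symmetric channel with crossover
    # probability eps: C = 1 - H(eps) bits per channel use.
    return 1.0 - binary_entropy(eps)

print(round(bsc_capacity(0.11), 3))  # roughly 0.5 bits per use
```

A noiseless channel (eps = 0) has capacity 1 bit per use, while a channel that flips bits half the time (eps = 0.5) has capacity 0, since its output is independent of its input.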