Introduction to Coding and Information Theory
This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: the efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and specific codes such as Hamming codes, simplex codes, and many others.
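To give a flavor of the material, the entropy of an information source mentioned above can be computed in a few lines. The sketch below is illustrative and not taken from the book; the function name and example distributions are my own.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a finite
    probability distribution, in bits per source symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss,
# while a biased source carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```

By Shannon's Noiseless Coding Theorem, this entropy is a lower bound on the average codeword length (in bits per symbol) achievable by any uniquely decipherable binary encoding of the source.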