Coding and Information Theory
Focusing on both theory and practical applications, this volume combines in a natural way the two major aspects of information representation: representation for storage (coding theory) and representation for transmission (information theory).