## Information Theory

Developed by Claude Shannon and Norbert Wiener in the late 1940s, information theory, or statistical communication theory, deals with the theoretical underpinnings of a wide range of communication devices: radio, television, radar, computers, telegraphy, and more. This book is an excellent introduction to the mathematics underlying the theory.

Designed for upper-level undergraduates and first-year graduate students, the book treats three major areas: analysis of channel models and proofs of coding theorems (Chapters 3, 7, and 8); study of specific coding systems (Chapters 2, 4, and 5); and study of the statistical properties of information sources (Chapter 6). Among the topics covered are noiseless coding, the discrete memoryless channel, error-correcting codes, information sources, channels with memory, and continuous channels.

The author has kept the prerequisites to a minimum; students should, however, have a knowledge of basic probability theory. Some measure theory and Hilbert space theory are also helpful for the last two sections of Chapter 8, which treat time-continuous channels. An appendix summarizes the Hilbert space background and the results from the theory of stochastic processes needed for these sections. The appendix is not self-contained, but it serves to pinpoint some of the specific tools needed for the analysis of time-continuous channels.

In addition to historical notes at the end of each chapter indicating the origin of some of the results, the author has included 60 problems with detailed solutions, making the book especially valuable for independent study.
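To give a flavor of the book's subject matter, here is a minimal sketch (not taken from the book itself) of two of its central quantities: the Shannon entropy of a discrete source and the capacity of a binary symmetric channel, both of which figure in the coding theorems the text develops.

```python
import math

def entropy(probs):
    """Shannon entropy H(X), in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel
    with crossover probability p."""
    return 1 - entropy([p, 1 - p])

# A fair coin carries exactly one bit of uncertainty per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A channel that flips half its bits conveys no information at all.
print(bsc_capacity(0.5))     # 0.0
```

The channel-capacity formula here is the standard one for the binary symmetric channel, one of the discrete memoryless channels analyzed in the book.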


### Contents

| Section | Page |
| --- | --- |
| I | 1 |
| II | 5 |
| III | 12 |
| IV | 16 |
| V | 21 |
| VI | 24 |
| VII | 27 |
| VIII | 28 |
| IX | 33 |
| X | 35 |
| XI | 36 |
| XII | 40 |
| XIII | 43 |
| XIV | 46 |
| XV | 49 |
| XVI | 53 |
| XVII | 60 |
| XVIII | 63 |
| XIX | 77 |
| XX | 80 |
| XXI | 83 |
| XXII | 87 |
| XXIII | 89 |
| XXIV | 91 |
| XXV | 95 |
| XXVI | 105 |
| XXVII | 110 |
| XXVIII | 113 |
| XXIX | 124 |
| XXX | 126 |
| XXXI | 127 |
| XXXII | 134 |
| XXXIII | 138 |
| XXXIV | 147 |
| XXXV | 156 |
| XXXVI | 161 |
| XXXVII | 163 |
| XXXVIII | 169 |
| XXXIX | 172 |

### Common terms and phrases

alphabet, assume, average probability, binary matrix, binary sequences, binary symmetric channel, bound, channel capacity, check digits, code word, code-word length, coding theorem, column, construct, converges, corrector, corresponding, coset, cyclic code, decision scheme, decoding set, define, discrete memoryless channel, eigenvalue, equality, error pattern, example, Fourier transform, function, fundamental theorem, given, Hamming bound, hence, input distribution, input sequence, instantaneous code, it follows, Lemma, linearly independent, Markov chain, Markov source, memoryless channel, minimal polynomial, modulo, n-sequences, nonzero, output sequence, P{Xt, parity check code, parity check matrix, positive integer, probability of error, problem, proof of Theorem, prove, random variables, real number, received sequence, result, sequence of codes, sequences of length, stationary distribution, steady state probabilities, Suppose, symbols, theory, transmission rate, transmitted, uncertainty, uniquely decipherable, vector, word length, words of length, zero