## Coding Theorems of Information Theory

The objective of the present edition of this monograph is the same as that of earlier editions: to provide readers with some mathematical maturity a rigorous and modern introduction to the ideas and principal theorems of probabilistic information theory. No prior knowledge of information theory is assumed. The rapid development of the subject means that any one book can now cover only a fraction of the literature. That literature is often written by engineers for engineers, and the mathematical reader may have some difficulty with it. The mathematician who understands the content and methods of this monograph should be able to read the literature and begin research of his own in a subject of mathematical beauty and interest.

The present edition differs from the second in the following respects: Chapter 6 has been completely replaced by one on arbitrarily varying channels; Chapter 7 has been greatly enlarged; Chapter 8, on semi-continuous channels, has been drastically shortened; and Chapter 11, on sequential decoding, has been removed entirely. The new Chapters 11-15 consist of material developed only in the last few years: rate distortion, source coding, multiple access channels, and degraded broadcast channels. Even the specialist will find a new approach in the treatment of these subjects. Many of the proofs are new, more perspicuous, and considerably shorter than the original ones.
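As a small illustrative aside (not taken from the book itself), the binary symmetric channel that the monograph treats has the classical capacity formula C = 1 - H(p), where H is the binary entropy function and p is the crossover probability. A minimal sketch in Python:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits for a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity C = 1 - H(p) of the binary symmetric channel
    with crossover probability p, in bits per channel use."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # noiseless channel: 1.0 bit per use
print(bsc_capacity(0.5))  # pure noise: 0.0 bits per use
```

The coding theorem and strong converse discussed in the book assert that reliable communication is possible at any rate below C and impossible above it.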


### Contents

| Section | Page |
| --- | --- |
| Heuristic Introduction to the Discrete Memoryless Channel | 1 |
| Sender and receiver | 22 |
| Compound Channels | 33 |


10 other sections not shown


### Common terms and phrases

arbitrarily varying channel, arbitrary, argument, asymptotic equipartition property, binary symmetric channel, called, capacity, Cartesian product, channel sequence, Chapter, Chebyshev's inequality, code n, coding theorem, compound channel, concave function, convex, coordinates, corresponding, cylinder set, decoding sets, defined, disjoint, disjoint sets, encoder, entropy, ergodic, exists a code, exp2, finite, follows, governs the transmission, Hence, independent chance variables, input alphabet, integer, knows the c.p.f., left member, Lemma, Let v0, letter, Markov chain, memoryless channel, n-sphere, notation, number of elements, number of sequences, obtain, obviously, output alphabet, pair, probability distribution, probability of error, probability space, probability vector, problem, proof of Theorem, prove Theorem, proves the theorem, rate triple, received sequence, receiver knows, resp, right member, satisfies, Section 5.1, sender and receiver, sent, set of n-sequences, sn-input space, stochastic input, strong converse, subcode, sufficiently large, transmitted, upper bound, vector, weak converse, Wolfowitz, word, write, zero