## Information Theory: Coding Theorems for Discrete Memoryless Systems

Csiszár and Körner's book is widely regarded as a classic in the field of information theory, providing deep insights and expert treatment of the key theoretical issues. It includes in-depth coverage of the mathematics of reliable information transmission, in both two-terminal and multi-terminal network scenarios. Updated and considerably expanded, this new edition presents unique discussions of information-theoretic secrecy and of zero-error information theory, including the deep connections of the latter with extremal combinatorics. The presentations of all core subjects are self-contained, even for the advanced topics, which helps readers to understand the important connections between seemingly different problems. Finally, 320 end-of-chapter problems, together with helpful hints for solving them, allow readers to develop a full command of the mathematical techniques. It is an ideal resource for graduate students and researchers in electrical and electronic engineering, computer science, and applied mathematics.


### Other editions - View all

*Information Theory: Coding Theorems for Discrete Memoryless Systems*, Imre Csiszár and János Körner, 2014

*Information Theory: Coding Theorems for Discrete Memoryless Systems*, Imre Csiszár and János Körner, 1997
