## Communication Theory

This book is an introduction, for mathematics students, to the theories of information and codes. These are usually treated separately but, as both address the problem of communication through noisy channels (albeit from different directions), the authors have been able to exploit the connection to give a reasonably self-contained treatment, relating the probabilistic and algebraic viewpoints. The style is discursive and, as befits the subject, plenty of examples and exercises are provided. Some examples of computer codes are given to provide concrete illustrations of abstract ideas.
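To give a flavour of the kind of topic the book treats, here is a minimal sketch (not taken from the book) of the Kraft inequality, which appears in the contents below: a prefix-free binary code with codeword lengths l1, ..., ln exists if and only if the sum of 2^(-li) is at most 1. The function names are illustrative choices of our own.

```python
def kraft_sum(lengths, radix=2):
    """Sum of radix**(-l) over the given codeword lengths."""
    return sum(radix ** -l for l in lengths)

def satisfies_kraft(lengths, radix=2):
    """True iff a prefix-free code with these word lengths can exist."""
    return kraft_sum(lengths, radix) <= 1

# {0, 10, 110, 111} is prefix-free: 1/2 + 1/4 + 1/8 + 1/8 = 1
print(satisfies_kraft([1, 2, 3, 3]))   # True
# No binary prefix-free code has three codewords of length 1
print(satisfies_kraft([1, 1, 1]))      # False
```

Equality in the Kraft sum, as in the first example, means the code is *complete*: no further codeword can be added without destroying the prefix property.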


### Contents

Economical representations: noiseless coding | 4
1.1 Sources, messages, codes | 5
The prefix condition | 8
Decision-tree representation | 9
Exercises | 10
1.3 The Kraft inequality | 11
Exercises | 13
Exercises | 16
1.5 Segmented block codes | 18
Exercises | 19
1.6 Shannon–Fano encoding | 20
Exercises | 23
1.8 Further topics | 24
Universal coding | 26
Other reading | 27
1.9 Problems | 28
Properties of a message source | 34
Probability spaces | 35
The random source | 36
Exercise | 37
Telegraph English | 38
Exercises | 40
2.3 The First Coding Theorem (FCT) | 41
The First Coding Theorem (FCT) | 42
Exercises | 43
2.4 Asymptotic Equipartition Property (AEP) | 44
Exercise | 45
Exercises | 46
2.6 Finite Markov chains | 47
Geometric ergodicity | 48
Exercises | 52
2.7 Markov sources | 53
Higher-order Markov sources | 55
2.8 The genetic code | 56
Location of redundancy | 58
Joint entropy | 60
2.10 Conditional entropy | 62
Conditional independence | 63
Exercises | 65
The Ergodic Theorem | 67
Exercises | 68
The Shannon–McMillan–Breiman Theorem | 69
Failure of ergodicity | 72
Exercises | 73
2.13 Further topics | 74
Epsilon-entropy, metric entropy and algorithmic information theory | 77
Statistical inference | 78
Ergodic theory | 79
2.14 Problems | 80
Reliable transmission | 87
Exercise | 89
3.2 Decoding: receiver optimization | 90
The discrete case | 91
ML decoding | 92
3.3 Random coding | 93
Exercise | 94
3.4 Introduction to channels | 95
Exercises | 96
3.5 Reliable transmission through the BSC | 97
Exercises | 101
3.6 Reliable transmission through the memoryless Gaussian channel | 102
The memoryless Gaussian channel (MGC) | 103
Reliable transmission | 105
Exercises | 108
Channel coding theorems | 109
The Hu correspondence | 111
Exercises | 112
4.2 The Second Coding Theorem (SCT) | 113
4.3 The discrete memoryless channel (DMC) | 117
Exercises | 119
4.4 Symmetric channels | 120
Exercises | 121
4.5 Continuous entropy and mutual information | 122
Mutual information | 125
Exercises | 126
4.6 The memoryless channel with additive white noise | 127
The memoryless Gaussian channel (MGC) | 129
Capacity under signal-power constraint | 130
Exercises | 132
4.7 Further topics | 133
Magnitude of the probability of error | 134
Channels with input costs | 135
Inequalities | 136
Other channels and systems | 137
Error-control codes | 144
Exercises | 148
5.2 Linear codes | 149
Syndrome decoding | 153
Exercises | 154
5.3 Constructing codes from other codes | 155
Exercises | 157
Exercises | 161
5.5 Reed–Muller codes | 162
Decoding Reed–Muller codes | 165
Exercises | 167
5.6 Further topics | 168
5.7 Problems | 169
Cyclic codes | 172
Exercises | 174
6.2 BCH codes | 175
Exercises | 177
Exercise | 178
6.5 Problems | 179
Appendix: rings, fields and vector spaces | 181
Exercises | 183
7.2 Fields | 184
Exercises | 186
Exercises | 188
Exercises | 189
7.6 Problems | 190
Bibliography | 191
References | 193
Notation summary | 196
Vectors, strings, matrices | 197
Entropy, information | 198
Algebraic objects | 199


### Common terms and phrases

algorithm alphabet Asymptotic BCH code Bernoulli source binary Cambridge channel capacity channel matrix Chapter check vectors code of length codewords Coding Theorem conditional Consider constraint continuous entropy convergence corresponding cosets cyclic code decipherable code Deduce defined Definition denote density digit emits equality iff equidistributed ergodic error probability error-control Euclidean domain Example Exercise finite field function Gibbs inequality given Hamming Hamming code Hence Huffman Huffman code independent inequality information rate information theory input integer irreducible Lemma linear code Markov chain memoryless minimal minimum distance ML decoding mutual information n-string non-negative non-zero number of elements optimal parity check matrix polynomial prefix-free code probability distribution Proof Prove random coding random variable Reed-Muller codes reliable encoding ring RM(d sequence Show simple Markov source source letters string subset subspace Suppose symbol taking values transmitted vector space word-lengths words
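Among the recurring terms above are "parity check matrix", "syndrome decoding" and "Hamming code". As an illustration of how these fit together (a sketch of our own, not code from the book), here is syndrome decoding for the binary [7,4] Hamming code, using the standard parity-check matrix whose columns are the numbers 1 to 7 in binary, so that the syndrome of a single-error word is exactly the error position.

```python
# H as a list of 3 rows; H[i][j] is the i-th bit of the column number j+1.
H = [[(j + 1) >> i & 1 for j in range(7)] for i in range(3)]

def syndrome(word):
    """Syndrome H*word over GF(2), read as an integer in 0..7."""
    bits = [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
    return bits[0] | bits[1] << 1 | bits[2] << 2

def correct(word):
    """Correct at most one flipped bit in a received 7-bit word."""
    word = list(word)
    pos = syndrome(word)      # 0 means "no error detected"
    if pos:
        word[pos - 1] ^= 1    # column pos of H equals the syndrome
    return word

received = [0, 0, 0, 0, 1, 0, 0]   # all-zero codeword with bit 5 flipped
print(correct(received))           # [0, 0, 0, 0, 0, 0, 0]
```

The decoder needs no codeword table: a single matrix-vector product over GF(2) locates the error, which is why syndrome decoding is the practical route to decoding linear codes.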