## Information and Coding Theory

As this Preface is being written, the twentieth century is coming to an end. Historians may perhaps come to refer to it as the century of information, just as its predecessor is associated with the process of industrialisation. Successive technological developments such as the telephone, radio, television, computers and the Internet have had profound effects on the way we live. We can see pictures of the surface of Mars or the early shape of the Universe. The contents of a whole shelf-load of library books can be compressed onto an almost weightless piece of plastic. Billions of people can watch the same football match, or can keep in instant touch with friends around the world without leaving home. In short, massive amounts of information can now be stored, transmitted and processed, with surprising speed, accuracy and economy.

Of course, these developments do not happen without some theoretical basis, and as is so often the case, much of this is provided by mathematics. Many of the first mathematical advances in this area were made in the mid-twentieth century by engineers, often relying on intuition and experience rather than a deep theoretical knowledge to lead them to their discoveries. Soon the mathematicians, delighted to see new applications for their subject, joined in and developed the engineers' practical examples into wide-ranging theories, complete with definitions, theorems and proofs.

### What people are saying

User Review

This is an excellent book for researchers working in Coding Theory, but it may not be useful to undergraduate students.

User Review

This book provides hardly any examples, proofs or explanations. I found it absolutely useless.

### Contents

| Chapter / Section | Page |
| --- | --- |
| 1 Source Coding | 1 |
| 1.2 Uniquely Decodable Codes | 4 |
| 1.3 Instantaneous Codes | 9 |
| 1.4 Constructing Instantaneous Codes | 11 |
| 1.5 Kraft's Inequality | 13 |
| 1.6 McMillan's Inequality | 14 |
| 1.7 Comments on Kraft's and McMillan's Inequalities | 16 |
| 1.8 Supplementary Exercises | 17 |
| 2 Optimal Codes | 19 |
| 2.2 Binary Huffman Codes | 22 |
| 2.3 Average Word-length of Huffman Codes | 26 |
| 2.4 Optimality of Binary Huffman Codes | 27 |
| 2.5 r-ary Huffman Codes | 28 |
| 2.6 Extensions of Sources | 30 |
| 2.7 Supplementary Exercises | 32 |
| 3 Entropy | 35 |
| 3.2 Properties of the Entropy Function | 40 |
| 3.3 Entropy and Average Word-length | 42 |
| 3.4 Shannon-Fano Coding | 45 |
| 3.5 Entropy of Extensions and Products | 47 |
| 3.6 Shannon's First Theorem | 48 |
| 3.7 An Example of Shannon's First Theorem | 49 |
| 3.8 Supplementary Exercises | 51 |
| 4 Information Channels | 55 |
| 4.2 The Binary Symmetric Channel | 60 |
| 4.3 System Entropies | 62 |
| 4.4 System Entropies for the Binary Symmetric Channel | 64 |
| 4.5 Extension of Shannon's First Theorem to Information Channels | 67 |
| 4.6 Mutual Information | 70 |
| 4.7 Mutual Information for the Binary Symmetric Channel | 72 |
| 4.8 Channel Capacity | 73 |
| 4.9 Supplementary Exercises | 76 |
| 5 Using an Unreliable Channel | 79 |
| 5.2 An Example of Improved Reliability | 82 |
| 5.3 Hamming Distance | 85 |
| 5.4 Statement and Outline Proof of Shannon's Theorem | 88 |
| 5.5 The Converse of Shannon's Theorem | 90 |
| 5.6 Comments on Shannon's Theorem | 93 |
| 5.7 Supplementary Exercises | 94 |
| 6 Error-correcting Codes | 97 |
| 6.2 Examples of Codes | 100 |
| 6.3 Minimum Distance | 104 |
| 6.4 Hamming's Sphere-packing Bound | 107 |
| 6.5 The Gilbert-Varshamov Bound | 111 |
| 6.6 Hadamard Matrices and Codes | 114 |
| 6.7 Supplementary Exercises | 118 |
| 7 Linear Codes | 121 |
| 7.2 Equivalence of Linear Codes | 127 |
| 7.3 Minimum Distance of Linear Codes | 131 |
| 7.4 The Hamming Codes | 133 |
| 7.5 The Golay Codes | 136 |
| 7.6 The Standard Array | 141 |
| 7.7 Syndrome Decoding | 143 |
| 7.8 Supplementary Exercises | 146 |
| Suggestions for Further Reading | 149 |
| Proof of the Sardinas-Patterson Theorem | 153 |
| The Law of Large Numbers | 157 |
| Proof of Shannon's Fundamental Theorem | 159 |
| Solutions to Exercises | 165 |
| Index of Symbols and Abbreviations | 195 |
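As a taste of the material the contents list covers, Kraft's Inequality (Section 1.5) gives a simple numerical test: an instantaneous (prefix) r-ary code with word-lengths l1, ..., ln exists if and only if the sum of r^(-li) is at most 1. The sketch below is not taken from the book; the function names are illustrative only.

```python
def kraft_sum(lengths, r=2):
    """Sum of r**(-l) over the given code word-lengths."""
    return sum(r ** (-l) for l in lengths)

def instantaneous_code_exists(lengths, r=2):
    """Kraft's Inequality: an instantaneous r-ary code with these
    word-lengths exists iff the Kraft sum is at most 1."""
    return kraft_sum(lengths, r) <= 1

# Lengths 1, 2, 3, 3 admit a binary prefix code (e.g. 0, 10, 110, 111):
print(instantaneous_code_exists([1, 2, 3, 3]))      # True (Kraft sum = 1.0)

# Five code-words of length 2 cannot all be binary prefix code-words:
print(instantaneous_code_exists([2, 2, 2, 2, 2]))   # False (Kraft sum = 1.25)
```

McMillan's Inequality (Section 1.6) shows the same condition characterises the larger class of uniquely decodable codes, so nothing is lost by restricting attention to instantaneous codes.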


### Common terms and phrases

alphabet, average word-length L(C), binary Huffman code, binary symmetric channel, blocks, channel matrix, Chapter, code of length, code-sequence, code-words, code-words wi, Coding Theory, columns of H, construct, correct, coset, decision rule, define, encoding, entropy, equality, equations, equiprobable, equivalent, error-correcting, error-pattern, error-probability, errors, Example, finite, function, given, gives, Golay codes, Hadamard matrix, Hamming code, hence, Hr(S), independent, inequality, input symbol, instantaneous code, instantaneous r-ary code, integer, Large Numbers, Lemma, linear code, log2, logr, matrix H, maximise, maximum likelihood rule, minimum distance, mutual information, nearest neighbour decoding, non-zero, optimal code, output symbols, parity-check matrix, p log p, prefix code, probabilities pi, probability distribution, proof, repetition code, rows, satisfying, sequence, Shannon-Fano code, Shannon's Fundamental Theorem, Shannon's Theorem, Show, source-symbols, sphere-packing bound, standard array, Steiner system, subsets, subspace, Supplementary Exercises, ternary, transmitted, upper bound, vectors