Information and Coding Theory

Springer Science & Business Media, Jun 26, 2000 - Mathematics - 210 pages
As this Preface is being written, the twentieth century is coming to an end. Historians may perhaps come to refer to it as the century of information, just as its predecessor is associated with the process of industrialisation. Successive technological developments such as the telephone, radio, television, computers and the Internet have had profound effects on the way we live. We can see pictures of the surface of Mars or the early shape of the Universe. The contents of a whole shelf-load of library books can be compressed onto an almost weightless piece of plastic. Billions of people can watch the same football match, or can keep in instant touch with friends around the world without leaving home. In short, massive amounts of information can now be stored, transmitted and processed, with surprising speed, accuracy and economy.

Of course, these developments do not happen without some theoretical basis, and as is so often the case, much of this is provided by mathematics. Many of the first mathematical advances in this area were made in the mid-twentieth century by engineers, often relying on intuition and experience rather than a deep theoretical knowledge to lead them to their discoveries. Soon the mathematicians, delighted to see new applications for their subject, joined in and developed the engineers' practical examples into wide-ranging theories, complete with definitions, theorems and proofs.
 

What people are saying

User Review

This is an excellent book for researchers working in Coding Theory, but it may not be useful to undergraduate students.

User Review

This book provides hardly any examples, proofs or explanations. I found it absolutely useless.

Contents

Source Coding  1
1.2 Uniquely Decodable Codes  4
1.3 Instantaneous Codes  9
1.4 Constructing Instantaneous Codes  11
1.5 Kraft's Inequality  13
1.6 McMillan's Inequality  14
1.7 Comments on Kraft's and McMillan's Inequalities  16
1.8 Supplementary Exercises  17

Optimal Codes  19
2.2 Binary Huffman Codes  22
2.3 Average Word-length of Huffman Codes  26
2.4 Optimality of Binary Huffman Codes  27
2.5 r-ary Huffman Codes  28
2.6 Extensions of Sources  30
2.7 Supplementary Exercises  32

Entropy  35
3.2 Properties of the Entropy Function  40
3.3 Entropy and Average Word-length  42
3.4 Shannon-Fano Coding  45
3.5 Entropy of Extensions and Products  47
3.6 Shannon's First Theorem  48
3.7 An Example of Shannon's First Theorem  49
3.8 Supplementary Exercises  51

Information Channels  55
4.2 The Binary Symmetric Channel  60
4.3 System Entropies  62
4.4 System Entropies for the Binary Symmetric Channel  64
4.5 Extension of Shannon's First Theorem to Information Channels  67
4.6 Mutual Information  70
4.7 Mutual Information for the Binary Symmetric Channel  72
4.8 Channel Capacity  73
4.9 Supplementary Exercises  76

Using an Unreliable Channel  79
5.2 An Example of Improved Reliability  82
5.3 Hamming Distance  85
5.4 Statement and Outline Proof of Shannon's Theorem  88
5.5 The Converse of Shannon's Theorem  90
5.6 Comments on Shannon's Theorem  93
5.7 Supplementary Exercises  94

Error-correcting Codes  97
6.2 Examples of Codes  100
6.3 Minimum Distance  104
6.4 Hamming's Sphere-packing Bound  107
6.5 The Gilbert-Varshamov Bound  111
6.6 Hadamard Matrices and Codes  114
6.7 Supplementary Exercises  118

Linear Codes  121
7.2 Equivalence of Linear Codes  127
7.3 Minimum Distance of Linear Codes  131
7.4 The Hamming Codes  133
7.5 The Golay Codes  136
7.6 The Standard Array  141
7.7 Syndrome Decoding  143
7.8 Supplementary Exercises  146

Suggestions for Further Reading  149
Proof of the Sardinas-Patterson Theorem  153
The Law of Large Numbers  157
Proof of Shannon's Fundamental Theorem  159
Solutions to Exercises  165
Bibliography  191
Index of Symbols and Abbreviations  195
Index  201
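As a taste of the material the opening chapter covers, Kraft's inequality (Section 1.5) states that a q-ary instantaneous (prefix) code with word lengths l_1, ..., l_n exists if and only if the sum of q^(-l_i) is at most 1. A minimal sketch of that check in Python (the function name `kraft_sum` is my own, not taken from the book):

```python
def kraft_sum(lengths, q=2):
    """Return the Kraft sum for the given codeword lengths over a
    q-symbol alphabet; an instantaneous code with these lengths
    exists if and only if this sum is at most 1."""
    return sum(q ** -l for l in lengths)

# Lengths {1, 2, 3, 3} admit a binary prefix code, e.g. 0, 10, 110, 111:
print(kraft_sum([1, 2, 3, 3]))  # 1.0, so such a code exists
# Lengths {1, 1, 2} do not: both one-bit words would be used up already.
print(kraft_sum([1, 1, 2]))     # 1.25 > 1, so no such code exists
```

Prefix codes matter because they can be decoded symbol by symbol without lookahead, which is why the book treats them before Huffman coding and entropy.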