Elements of Information Theory

Following a brief introduction and overview, early chapters cover the basic algebraic relationships of entropy, relative entropy, and mutual information; the asymptotic equipartition property (AEP); entropy rates of stochastic processes; data compression; and the duality between data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information theory and statistics, rate distortion theory, and network information theory. The final two chapters examine the stock market and inequalities in information theory. In many cases the authors describe the properties of the solutions before presenting the problems.
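The core quantities named above (entropy and mutual information) can be illustrated with a minimal sketch; the function names here are my own illustrative choices, not code from the book, and the binary-symmetric-channel example is a standard one rather than a specific exercise from the text:

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability mass function."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) in bits, given a joint pmf as a 2-D list of probabilities."""
    px = [sum(row) for row in joint]                # marginal of X
    py = [sum(col) for col in zip(*joint)]          # marginal of Y
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin has exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0

# Binary symmetric channel with crossover probability eps and uniform input:
# I(X;Y) = 1 - H(eps); eps = 0.11 gives roughly 0.5 bits.
eps = 0.11
joint = [[(1 - eps) / 2, eps / 2],
         [eps / 2, (1 - eps) / 2]]
print(mutual_information(joint))
```

The mutual-information value here is the capacity of that channel, since the uniform input happens to be the maximizing distribution for the binary symmetric channel.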
What people are saying
Review: Elements of Information Theory
User Review by Jason Yang (Goodreads)
Cover and Thomas is THE classic information theory textbook. Here, the authors took on the ambitious task of making a comprehensive survey of (the still evolving) information theory. Admittedly, I got ...
Review: Elements of Information Theory
User Review by Huyen (Goodreads)
Pure gold. A classic in Information Theory.
Contents
List of Figures  1 
Entropy, Relative Entropy and Mutual Information  12 
The Asymptotic Equipartition Property  50 
Copyright  
15 other sections not shown
Common terms and phrases
achievable algorithm alphabet average binary symmetric channel broadcast channel calculate capacity region chain rule channel capacity Chapter codebook codeword codeword lengths conditional entropy Consider convex corresponding data compression define Definition denote density describe differential entropy doubling rate drawn i.i.d. encoding entropy rate equal ergodic estimate example Fano's inequality feedback finite Gaussian channel given Hence Huffman code independent information theory input joint distribution jointly typical Kolmogorov complexity Kraft inequality large numbers Lemma lower bound Markov chain matrix maximizes maximum entropy minimization multiple access channel mutual information node noise optimal code output pair probability mass function probability of error problem proof prove random variable rate distortion function receiver relative entropy relay channel sample sender side information Slepian-Wolf source coding source sequence stationary distribution stochastic process string symbols tree typical sequences typical set uniquely decodable upper bound vector wealth X1 X2