Mathematical Foundations of Information Theory
The first comprehensive introduction to information theory, this text explores the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin. Its rigorous treatment addresses the entropy concept in probability theory and the fundamental theorems of the subject, as well as ergodic sources, the martingale concept, anticipation and memory, and other topics. 1957 edition.