## Abstract Methods in Information Theory

Information theory is studied from three viewpoints: (1) the theory of entropy as the amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties examined; the latter entropy is extended to a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as well as AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the setting of real and functional analysis as well as probability theory. Ergodic channels are characterized in various ways, and mixing and AMS channels are also treated in detail with illustrations. Several other aspects of information channels, including measurability, approximation, and noncommutative extensions, are discussed as well.
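As a small concrete illustration of the first topic, Shannon entropy of a discrete distribution \(p = (p_1, \dots, p_n)\) is \(H(p) = -\sum_i p_i \log_2 p_i\). The sketch below (not from the book; a standard definition) computes it for a finite probability vector:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing (0 * log 0 := 0 by convention).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution on 4 symbols: maximal entropy log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0

# A degenerate (deterministic) source carries no information.
print(shannon_entropy([1.0]))  # → 0.0
```

Kolmogorov-Sinai entropy, by contrast, is defined for a measure-preserving dynamical system as a supremum of such partition entropies per time step; the book develops it as a functional on a set of invariant measures.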
