Information Theory

Students of electrical engineering or applied mathematics can find no clearer presentation of the principles of information theory than this excellent introduction. After explaining the nature of information theory and its problems, the author examines a variety of important topics: information theory of discrete systems; properties of continuous signals; ergodic ensembles and random noise; entropy of continuous distributions; the transmission of information in band-limited systems having a continuous range of values; an introduction to the use of signal space; information theory aspects of modulation and noise reduction; and linear correlation, filtering, and prediction. Numerous problems appear throughout the text, many with complete solutions. 1953 edition.
Contents
SOME PROPERTIES OF CONTINUOUS SIGNALS | 65
ERGODIC ENSEMBLES AND RANDOM NOISE | 85
THE ENTROPY OF CONTINUOUS DISTRIBUTIONS | 127
Copyright
References to this book
Mathematical Foundations of Information Theory, by Aleksandr Yakovlevich Khinchin, 1957