Students of electrical engineering or applied mathematics can find no clearer presentation of the principles of information theory than this excellent introduction. After explaining the nature of information theory and its problems, the author examines a variety of important topics: information theory of discrete systems; properties of continuous signals; ergodic ensembles and random noise; entropy of continuous distributions; the transmission of information in band-limited systems having a continuous range of values; an introduction to the use of signal space; information theory aspects of modulation and noise reduction; and linear correlation, filtering, and prediction. Numerous problems appear throughout the text, many with complete solutions. 1953 ed.
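As a taste of the discrete-systems material the book covers, here is a minimal sketch (not drawn from the text itself) of Shannon entropy for a discrete source, the quantity underlying the book's treatment of channel capacity:

```python
import math

def shannon_entropy(probs):
    """Entropy H = -sum(p_i * log2 p_i) of a discrete distribution, in bits.

    Zero-probability symbols contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit per symbol.
print(shannon_entropy([0.5, 0.5]))        # → 1.0

# A biased source carries less than one bit per symbol.
print(shannon_entropy([0.9, 0.1]))
```

The function name and structure are illustrative choices, not the book's notation; the formula itself is the standard discrete entropy the book develops.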
INFORMATION THEORY OF DISCRETE SYSTEMS
SOME PROPERTIES OF CONTINUOUS SIGNALS
ERGODIC ENSEMBLES AND RANDOM NOISE