An Introduction to Information Theory: Symbols, Signals & Noise

Covers encoding and binary digits, entropy, language and meaning, efficient encoding and the noisy channel, and explores ways in which information theory relates to physics, cybernetics, psychology, and art. "Uncommonly good...the most satisfying discussion to be found." (Scientific American). 1980 edition.
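The description mentions entropy and binary digits; as a small illustration of the central quantity the book builds on (a sketch, not an excerpt from the book), Shannon's entropy H = -Σ p·log₂(p) measures the average number of bits per symbol a source emits:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss; a biased 90/10 coin
# carries less, since its outcomes are more predictable.
print(entropy_bits([0.5, 0.5]))  # 1.0
print(entropy_bits([0.9, 0.1]))  # about 0.469
```

The biased coin's lower entropy is what makes efficient encoding (e.g. Huffman codes, also listed among the book's terms) possible: predictable sources can be compressed below one binary digit per symbol on average.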
What people are saying
Review: An Introduction to Information Theory: Symbols, Signals and Noise
User Review - David (Goodreads)
A gentle, but somewhat dated (1961) introduction to information theory.

Review: An Introduction to Information Theory: Symbols, Signals and Noise
User Review - Kathleen Fredd (Goodreads)
I learned a bit and enjoyed the read. Another reviewer said it was a 'gentle thorough' introduction to the topic. I can't speak to the thorough, not my field, but it was gentle. Pierce has a sense of ...
Contents
THE WORLD AND THEORIES  1 
THE ORIGINS OF INFORMATION THEORY  19 
A MATHEMATICAL MODEL  45 
Copyright  
14 other sections not shown
Other editions
An Introduction to Information Theory: Symbols, Signals and Noise, John R. Pierce. Limited preview, 2012.
An Introduction to Information Theory: Symbols, Signals & Noise, John Robinson Pierce. Limited preview, 1980.
An Introduction to Information Theory: Symbols, Signals & Noise, John Robinson Pierce. No preview available, 1980.
Common terms and phrases
amplifier, amplitude, average number, band width, bandlimited signal, binary digits, binary numbers, bits per second, block, called, channel capacity, Chapter, characters, choice, circuit, communication theory, corresponding, current values, cybernetics, digits per second, digram, efficient encoding, electrical, energy, English text, entropy, equal, equation, ergodic source, error, example, Figure, finite-state machine, fraction, frequencies, given, grammatical, Huffman code, human, hypersphere, illustrations, important, information rate, information theory, input, instance, Johnson noise, language, large number, linear, logarithm, mathematicians, Maxwell's equations, means, measure, message source, molecule, motion, negative feedback, network theory, noise power, noisy channel, number of binary, Nyquist, octal, output, particular, physical, possible, prediction, probability, problem, pulse, radiation, radio, random, received, represent, samples, sent, sentence, Shannon, signal power, simple, sine wave, sort, sound, space, specify, speech, square, statistical mechanics, Suppose, telegraph, telegraphy, temperature, theorem, transmission, transmit, uncertainty, vocoder, words, write, Zipf's law