## Science and Information Theory

A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics and is appropriate for upper-level undergraduates and graduate students. Topics include the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, and thermal noise in an electric circuit. A discussion of the negentropy principle of information introduces the author's renowned examination of Maxwell's demon. Concluding chapters explore the links between information theory, the uncertainty principle, and physical limits of observation, as well as problems related to computing, the organization of information, and inevitable errors.


### Contents

| Section | Page |
| --- | --- |
| The Definition of Information | 1 |
| Application of the Definitions and General Discussion | 11 |
| Redundancy in the English Language | 21 |
| Principles of Coding; Discussion of the Capacity of a Channel | 28 |
| Appendix | 49 |
| Coding Problems | 51 |
| Alphabetic Coding, Ternary System | 53 |
| Alphabet and Numbers | 54 |
| Binary Coding by Words | 55 |
| Alphabetic Coding by Words | 58 |
| Error-Detecting and Correcting Codes | 62 |
| Single-Error-Detecting Codes | 63 |
| Single-Error-Correcting and Double-Error-Correcting Codes | 66 |
| Efficiency of Self-Correcting Codes | 67 |
| The Capacity of a Binary Channel with Noise | 69 |
| Applications to Some Special Problems | 71 |
| Filing with Cross-Referencing | 73 |
| The Most Favorable Number of Signals per Elementary Cell | 75 |
| Fourier Method and Sampling Procedure | 78 |
| The Gibbs Phenomenon and Convergence of Fourier Series | 80 |
| Fourier Integrals | 83 |
| The Role of Finite Frequency Band Width | 87 |
| The Uncertainty Relation for Time and Frequency | 89 |
| Degrees of Freedom of a Message | 93 |
| Shannon's Sampling Method | 97 |
| Gabor's Information Cells | 99 |
| Autocorrelation and Spectrum; the Wiener-Khintchine Formula | 101 |
| Linear Transformations and Filters | 103 |
| Fourier Analysis and the Sampling Method in Three Dimensions | 105 |
| Crystal Analysis by X-Rays | 111 |
| Appendix: Schwarz Inequality | 113 |
| Summary of Thermodynamics | 114 |
| Impossibility of Perpetual Motion; Thermal Engines | 117 |
| Statistical Interpretation of Entropy | 119 |
| Examples of Statistical Discussions | 121 |
| Energy Fluctuations; Gibbs Formula | 122 |
| Quantized Oscillator | 124 |
| Fluctuations | 125 |
| Thermal Agitation and Brownian Motion | 128 |
| Appendix | 139 |
| The Negentropy Principle of Information | 152 |
| Maxwell's Demon and the Negentropy Principle | 162 |
| Appendix I | 182 |
| Observation and Information | 202 |
| Length Measurements with Low Accuracy | 204 |
| Length Measurements with High Accuracy | 206 |
| Efficiency of an Observation | 209 |
| Measurement of a Distance with an Interferometer | 210 |
| Another Scheme for Measuring Distance | 213 |
| The Measurement of Time Intervals | 217 |
| Observation under a Microscope | 219 |
| Discussion of the Focus in a Wave Guide | 223 |
| Examples and Discussion | 226 |
| Summary | 228 |
| Information Theory, the Uncertainty Principle, and Physical Limits of Observation | 229 |
| An Observation is an Irreversible Process | 231 |
| General Limitations in the Accuracy of Physical Measurements | 232 |
| The Limits of Euclidean Geometry | 235 |
| Possible Use of Heavy Particles Instead of Photons | 236 |
| Uncertainty Relations in the Microscope Experiment | 238 |
| Measurement of Momentum | 241 |
| Uncertainty in Field Measurements | 243 |
| The Negentropy Principle of Information in Telecommunications | 245 |
| Representation in Hyperspace | 246 |
| The Capacity of a Channel with Noise | 247 |
| Discussion of the Tuller-Shannon Formula | 248 |
| A Practical Example | 252 |
| The Negentropy Principle Applied to the Channel with Noise | 254 |
| Gabor's Modified Formula and the Role of Beats | 257 |
| Writing, Printing, and Reading | 259 |
| The Problem of Reading and Writing | 260 |
| Dead Information and How to Bring it Back to Life | 261 |
| Writing and Printing | 263 |
| Discussion of a Special Example | 264 |
| New Information and Redundancy | 265 |
| The Problem of Computing | 267 |
| The Computer as a Mathematical Element | 269 |
| The Computer as a Circuit Element; Sampling and Desampling (Linvill and Salzer) | 273 |
| Computing on Sampled Data at Time | 275 |
| The Transfer Function for a Computer | 277 |
| Circuits Containing a Computer; the Problem of Stability | 279 |
| Discussion of the Stability of a Program | 281 |
| A Few Examples | 283 |
| Information, Organization, and Other Problems | 287 |
| Information Contained in a Physical Law | 289 |
| Information Contained in a Numerical Table | 291 |
| General Remarks | 293 |
| Examples of Problems Beyond the Present Theory | 294 |
| Problems of Semantic Information | 297 |
| Inevitable Errors, Determinism, and Information | 302 |
| The Viewpoint of M. Born | 303 |
| Observation and Experimental Errors | 304 |
| Laplace's Demon Exorcised | 305 |
| Anharmonic Oscillators; Rectifier | 308 |
| The Anomaly of the Harmonic Oscillator | 311 |
| The Problem of Determinism | 314 |
| Information Theory and our Preceding Examples | 316 |
| Observation and Interpretation | 318 |
| Conclusions | 320 |
| The Problem of Very Small Distances | 321 |
| The Possible Use of These Remarks for the Computation of Diverging Integrals in Physics | 322 |
| Electromagnetic Mass of the Electron | 324 |
| Schrödinger's Zitterbewegung | 325 |
| Discussion and Possible Generalizations | 326 |
| Author Index | 329 |
| Subject Index | 331 |
| Books published by L. Brillouin | 349 |

