Science and Information Theory

By Léon Brillouin
Courier Corporation, 2004 - Science - 351 pages
A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics and is appropriate for upper-level undergraduates and graduate students. Topics include the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, and thermal noise in an electric circuit. A discussion of the negentropy principle of information introduces the author's renowned examination of Maxwell's demon. Concluding chapters explore the associations between information theory, the uncertainty principle, and physical limits of observation, in addition to problems related to computing, organizing information, and inevitable errors.
 

What people are saying

User Review

Most people today use a variety of computing devices and understand them only in a cursory way, knowing that they run programs made up of bits and bytes organized into computer languages in order to perform different tasks. Few people (including many who work with computers at very sophisticated levels) understand the role that information theory played in laying the foundation for today's computer revolution.
Information theory looks at a very fundamental issue: how is information formed into a coded means of communication, and how is it then transmitted and received so that the person (or machine) receiving it correctly obtains the knowledge it contains? This can be two people talking to each other face to face, or Earth communicating with a space probe at the edge of our solar system; both face exactly the same problems and issues.
Léon Brillouin approaches the issue by expanding on the ideas of Claude E. Shannon as articulated in his landmark 1948 paper, "A Mathematical Theory of Communication." Shannon's model of communication is deceptively simple, consisting of only three elements: a transmitter, a channel of communication, and a receiver. However, as Shannon demonstrated in the paper, information can be acted upon in ways that make its successful transfer through the system problematic.
Shannon used statistical analysis to demonstrate the probabilistic nature of communication, in which transmitted data are subject to entropy and have varying chances of being received successfully. The kind of signal transmitted, the bandwidth of the channel, and the amount of randomness introduced into the channel, to name a few factors, all affect the probability that information is received correctly.
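To make the statistical point concrete, here is a minimal sketch (my own illustration, not drawn from Shannon's paper or Brillouin's text) of two quantities the review alludes to: the Shannon entropy of a message's symbol distribution, and the capacity of a simple binary channel that flips each bit with probability p. The function names are invented for this example.

    # Illustrative sketch only; not taken from Brillouin's book.
    from collections import Counter
    from math import log2

    def shannon_entropy(message: str) -> float:
        # Average information per symbol: H = -sum(p_i * log2(p_i)), in bits.
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    def binary_channel_capacity(p: float) -> float:
        # Capacity of a binary symmetric channel with bit-flip probability p:
        # C = 1 - H(p) bits per transmitted bit; more randomness means less capacity.
        if p in (0.0, 1.0):
            return 1.0
        return 1 + p * log2(p) + (1 - p) * log2(1 - p)

    print(shannon_entropy("information theory"))   # about 3.46 bits per symbol
    print(binary_channel_capacity(0.1))            # about 0.53 bits per channel use

The second function shows in miniature what the review describes: as the randomness in the channel grows, the number of useful bits that get through per transmitted bit shrinks.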
Brillouin takes Shannon's statistical approach and expands it, further articulating the main concepts in Shannon's work and underscoring the importance of their implications. More importantly, Brillouin outlines with clarity how the binary numbering system works within an information system, demonstrating how its choice allows information to be transmitted and received successfully in a system such as a computer.
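As a small illustration of the binary coding the reviewer has in mind (again my own sketch, not an example from the book), the snippet below encodes a short text into a stream of bits using a fixed-length 8-bit code per character and then decodes it back; the helper names are my own.

    # Illustrative sketch only; a fixed-length binary code for text.
    def encode_to_bits(text: str) -> str:
        # Represent each character by its 8-bit binary code point.
        return "".join(format(ord(ch), "08b") for ch in text)

    def decode_from_bits(bits: str) -> str:
        # Group the bit stream into 8-bit words and map each back to a character.
        return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

    bits = encode_to_bits("Shannon")
    print(bits)                    # 56 bits, beginning 01010011 01101000 ...
    print(decode_from_bits(bits))  # Shannon

A fixed-length code like this is the simplest case; the book's early chapters (see the contents below) take up redundancy, coding principles, and error-detecting and error-correcting codes that go well beyond it.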
If one wishes to fully understand and appreciate how today's digital computers are able to process vast amounts of information accurately, in personal computers as well as in the emerging "big data" systems, then this book will provide that understanding.
Be forewarned that this is not a book written for the lay reader but for university students at the senior or graduate level. It is a statistical study, and most of what the reader will encounter is statistical formulas. Even so, there is real value in following Brillouin's systematic building of his ideas, and his presentation of his concepts is very accessible.
Brillouin writes in a clear and uncluttered way that shows his mastery of the topic. His understanding of Shannon's ideas, and of their implications, is very apparent. It is worth taking the time to work through this book, as it will open new levels of understanding to those who invest the effort.
 

Contents

The Definition of Information
1
Table of Contents
5
Application of the Definitions and General Discussion
11
Redundancy in the English Language
21
Principles of Coding, Discussion of the Capacity of a Channel
28
Appendix
49
Coding Problems
51
Alphabetic Coding, Ternary System
53
Alphabet and Numbers
54
Binary Coding by Words
55
Alphabetic Coding by Words
58
Error Detecting and Correcting Codes
62
Single Error Detecting Codes
63
Single Error Correcting and Double Error Correcting Codes
66
Efficiency of Self-Correcting Codes
67
The Capacity of a Binary Channel with Noise
69
Applications to Some Special Problems
71
Filing with Cross Referencing
73
The Most Favorable Number of Signals per Elementary Cell
75
Fourier Method and Sampling Procedure
78
The Gibbs Phenomenon and Convergence of Fourier Series
80
Fourier Integrals
83
The Role of Finite Frequency Band Width
87
The Uncertainty Relation for Time and Frequency
89
Degrees of Freedom of a Message
93
Shannon's Sampling Method
97
Gabor's Information Cells
99
Autocorrelation and Spectrum, the Wiener-Khintchine Formula
101
Linear Transformations and Filters
103
Fourier Analysis and the Sampling Method in Three Dimensions
105
Crystal Analysis by X-Rays
111
Appendix: Schwarz Inequality
113
Summary of Thermodynamics
114
Impossibility of Perpetual Motion; Thermal Engines
117
Statistical Interpretation of Entropy
119
Examples of Statistical Discussions
121
Energy Fluctuations, Gibbs Formula
122
Quantized Oscillator
124
Fluctuations
125
Thermal Agitation and Brownian Motion
128
Appendix
139
The Negentropy Principle of Information
152
Maxwell's Demon and the Negentropy Principle
162
Appendix I
182
Observation and Information
202
Length Measurements with Low Accuracy
204
Length Measurements with High Accuracy
206
Efficiency of an Observation
209
Measurement of a Distance with an Interferometer
210
Another Scheme for Measuring Distance
213
The Measurement of Time Intervals
217
Observation under a Microscope
219
Discussion of the Focus in a Wave Guide
223
Examples and Discussion
226
Summary
228
Information Theory, the Uncertainty Principle, and Physical Limits of Observation
229
An Observation is an Irreversible Process
231
General Limitations in the Accuracy of Physical Measurements
232
The Limits of Euclidean Geometry
235
Possible Use of Heavy Particles Instead of Photons
236
Uncertainty Relations in the Microscope Experiment
238
Measurement of Momentum
241
Uncertainty in Field Measurements
243
The Negentropy Principle of Information in Telecommunications
245
Representation in Hyperspace
246
The Capacity of a Channel with Noise
247
Discussion of the Tuller-Shannon Formula
248
A Practical Example
252
The Negentropy Principle Applied to the Channel with Noise
254
Gabor's Modified Formula and the Role of Beats
257
Writing, Printing, and Reading
259
The Problem of Reading and Writing
260
Dead Information and How to Bring it Back to Life
261
Writing and Printing
263
Discussion of a Special Example
264
New Information and Redundancy
265
The Problem of Computing
267
The Computer as a Mathematical Element
269
The Computer as a Circuit Element; Sampling and Desampling (Linvill and Salzer)
273
Computing on Sampled Data at Time
275
The Transfer Function for a Computer
277
Circuits Containing a Computer The Problem of Stability
279
Discussion of the Stability of a Program
281
A Few Examples
283
Information, Organization, and Other Problems
287
Information Contained in a Physical Law
289
Information Contained in a Numerical Table
291
General Remarks
293
Examples of Problems Beyond the Present Theory
294
Problems of Semantic Information
297
Inevitable Errors, Determinism, and Information
302
The Viewpoint of M. Born
303
Observation and Experimental Errors
304
Laplace's Demon Exorcised
305
Anharmonic Oscillators, Rectifier
308
The Anomaly of the Harmonic Oscillator
311
The Problem of Determinism
314
Information Theory and our Preceding Examples
316
Observation and Interpretation
318
Conclusions
320
The Problem of Very Small Distances
321
The Possible Use of These Remarks for the Computation of Diverging Integrals in Physics
322
Electromagnetic Mass of the Electron
324
Schrödinger's Zitterbewegung
325
Discussion and Possible Generalizations
326
Author Index
329
Subject Index
331
Books published by L. Brillouin
349
Copyright
