An Introduction to the Modeling of Neural Networks
This text is a graduate-level introduction to neural networks, focusing on current theoretical models, examining what these models can reveal about how the brain functions, and discussing the ramifications for psychology, artificial intelligence, and the construction of a new generation of intelligent computers. The book is divided into four parts. The first part gives an account of the anatomy of the central nervous system, followed by a brief introduction to neurophysiology. The second part is devoted to the dynamics of neuronal states, and demonstrates how very simple models can simulate associative memory. The third part discusses models of learning, including detailed treatments of the limits of memory storage, methods of learning and their associated models, associativity, and error correction. The final part reviews possible applications of neural networks in artificial intelligence, expert systems, and optimization problems, as well as the construction of actual neuronal supercomputers, with the potential for a hundredfold increase in speed over contemporary serial machines.
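To make the description concrete, the following is a minimal sketch of the kind of "very simple model" the second part refers to: a Hopfield-style network whose synaptic efficacies are fixed by the Hebbian rule and whose zero-noise dynamics retrieve a memorized pattern from a corrupted cue. The network size, number of patterns, and variable names are illustrative assumptions, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                        # number of binary (+/-1) neurons
patterns = rng.choice([-1, 1], size=(3, N))   # memorized patterns xi^mu

# Hebbian prescription: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with J_ii = 0
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0)

def recall(state, sweeps=10):
    """Noiseless asynchronous dynamics: each neuron aligns with its
    local field h_i = sum_j J_ij s_j, visited in random order."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Corrupt a memorized pattern by flipping 10 of its 64 bits, then relax.
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1

out = recall(cue)
overlap = (out * patterns[0]).mean()   # 1.0 means perfect retrieval
print(overlap)
```

At this low memory load (3 patterns in 64 neurons) the corrupted cue falls inside the basin of attraction of the stored pattern, so the final overlap is close to 1; the storage limits at which such retrieval breaks down are the subject of the third part of the book.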
Contents

1.2 A brief historical overview
1.3 Organization of the book
2 The biology of neural networks: a few features for the sake of nonbiologists
2.2 The anatomy of central nervous systems
2.3 A brief survey of neurophysiology
2.4 A summary of experimental observations
3 The dynamics of neural networks: a stochastic approach
3.2 Noiseless neural networks
3.3 Taking synaptic noise into account
4 Hebbian models of associative memory
4.2 Stochastic Hebbian neural networks in the limit of finite numbers of memorized patterns
4.3 The technique of field distributions
4.4 The replica method approach
4.5 General dynamics of neural networks
5 Temporal sequences of patterns
5.2 Stochastic dynamics
5.3 An example of conditioned behavior
6 The problem of learning in neural networks
6.2 Linear separability
6.3 Computing the volume of solutions
7 Learning dynamics in visible neural networks
7.2 Constraining the synaptic efficacies
7.3 Projection algorithms
7.4 The perceptron learning rules
7.5 Correlated patterns
8 Solving the problem of credit assignment
8.2 Handling internal representations
8.3 Learning in Boolean networks
9.3 Three questions about learning
10.3 Low-level signal processing
10.4 Pattern matching
10.5 Some speculations on biological systems
10.6 Higher associative functions
11.2 Semi-parallel neurocomputers
12 A critical view of the modeling of neural networks
12.2 The neural code
12.3 The synfire chains
12.4 Computing with attractors versus computing with flows of information
12.5 The issue of low neuronal activities
12.6 Learning and cortical plasticity
12.7 Taking the modular organization of the cortex into account
12.8 The problem of artificial intelligence
12.9 Concluding remarks