Mathematical Perspectives on Neural Networks

Paul Smolensky, Michael C. Mozer, David E. Rumelhart
Psychology Press, 1996 - Computers - 862 pages
Recent years have seen an explosion of new mathematical results on learning and processing in neural networks. This body of results rests on a breadth of mathematical background that few specialists possess. In a format intermediate between a textbook and a collection of research articles, this book has been assembled to present a sample of these results and to fill in the necessary background in such areas as computability theory, computational complexity theory, the theory of analog computation, stochastic processes, dynamical systems, control theory, time-series analysis, Bayesian analysis, regularization theory, information theory, computational learning theory, and mathematical statistics.

Mathematical models of neural networks display an amazing richness and diversity. Neural networks can be formally modeled as computational systems, as physical or dynamical systems, and as statistical analyzers. Within each of these three broad perspectives, there are a number of particular approaches. For each of 16 particular mathematical perspectives on neural networks, the contributing authors provide introductions to the background mathematics, and address questions such as:
* Exactly what mathematical systems are used to model neural networks from the given perspective?
* What formal questions about neural networks can then be addressed?
* What are typical results that can be obtained?
* What are the outstanding open problems?

A distinctive feature of this volume is that, for each perspective presented in one of the contributed chapters, the first editor has provided a moderately detailed summary of the formal results and the requisite mathematical concepts. These summaries are presented in four chapters that tie together the 16 contributed chapters: three develop a coherent view of the three general perspectives -- computational, dynamical, and statistical -- and the fourth assembles these three perspectives into a unified overview of the neural networks field.
 


Contents

Computational, Dynamical, and Statistical (p. 1)
Stan Franklin, Institute for Intelligent Systems, Department of Mathematical (p. 10)
Computational Perspectives (p. 17)
Computation by Discrete Neural Nets (p. 41)
Circuit Complexity and Feedforward Neural Networks (p. 85)
Complexity of Learning (p. 113)
Deterministic and Randomized Local Search (p. 143)
Dynamical Perspectives on Neural Networks (p. 245)
Dynamical Systems (p. 271)
Statistical Analysis of Neural Networks (p. 325)
Neural Networks in Control Systems (p. 347)
Time Series Analysis and Prediction (p. 395)
Statistical Perspectives on Neural Networks (p. 453)
Regularization in Neural Nets (p. 497)
The Basic Theory (p. 533)
Information Theory and Neural Nets (p. 567)
Hidden Markov Models and Some Connections (p. 603)
Probably Approximately Correct Learning (p. 651)
Richard Golden, School of Human Development, University of Texas at Dallas (p. 688)
David Haussler, Department of Computer Science, University of California (p. 715)
Parametric Statistical Estimation with Artificial (p. 719)
Inductive Principles of Statistics and Learning Theory (p. 777)
Author Index (p. 843)
Subject Index (p. 855)

About the author (1996)

Paul Smolensky is Professor of Cognitive Science at Johns Hopkins University. He was a leading member of the PDP connectionist research group, and is the recipient of the 2005 David E. Rumelhart Prize in Cognitive Science, which is awarded annually to an individual or collaborative team making a significant contribution to the formal analysis of human cognition.

Michael C. Mozer is a Professor in the Department of Computer Science and the Institute of Cognitive Science at the University of Colorado, Boulder. In 1990 he received the Presidential Young Investigator Award from the National Science Foundation.

David E. Rumelhart (1942-2011) served as Professor of Psychology at the University of California, San Diego and at Stanford University. With James McClelland, he was awarded the 2002 University of Louisville Grawemeyer Award for Psychology for their work on parallel distributed processing, the cognitive framework at the heart of connectionism.
