Neural Networks for Pattern Recognition

Clarendon Press, Nov 23, 1995 - Computers - 482 pages
16 Reviews
This book provides the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts of pattern recognition, the book describes techniques for modelling probability density functions, and discusses the properties and relative merits of the multi-layer perceptron and radial basis function network models. It also motivates the use of various forms of error functions, and reviews the principal algorithms for error function minimization. As well as providing a detailed discussion of learning and generalization in neural networks, the book also covers the important topics of data processing, feature extraction, and prior knowledge. The book concludes with an extensive treatment of Bayesian techniques and their applications to neural networks.
  

What people are saying

User ratings

5 stars: 7
4 stars: 6
3 stars: 2
2 stars: 0
1 star: 1

User Review

It has been a long time since 1995, and many new techniques and important developments have taken place in the field of A.I. and, more concretely, machine learning. Still, this book has aged very well, for two reasons. First, it covers the fundamental techniques and concepts that every practitioner must understand and be able to use, such as non-parametric techniques for density estimation (kNN), dimensionality reduction (PCA), and mixture models, in addition to, of course, neural networks. Second, with its last chapter on Bayesian techniques, this book paves the way for moving on to modern techniques like deep energy models and deep belief networks.
The explanations are clear and easy to read. The properties of, and advances based on, neural networks are presented in a principled way in the context of statistical pattern recognition. The exercises are wisely chosen to ensure understanding of the presented results and of the conditions under which they were derived.
But this book goes beyond theory: a chapter is devoted to optimization techniques, i.e. the algorithms used to train neural networks in practice. After reading that chapter and working through the exercises, you will have a good understanding of conjugate gradients and BFGS.
The chapter on how to improve generalization, whether by optimizing the structure of the network or by combining multiple classifiers, is kept at an intuitive level, yet the concepts are well motivated, and the few mathematical details help the reader achieve a solid grasp of why those ideas work. As in the rest of the chapters, it explains how to carry things out in practice, i.e. how to check whether a classifier has actually improved. By the end of the chapter the reader is familiar with the concepts of regularization (weight decay), cross-validation, and bagging.
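The weight-decay regularization mentioned in the review above can be illustrated with a minimal sketch: gradient descent on a squared-error loss plus an L2 penalty on the weights, here for a plain linear model rather than a neural network. All names and parameter values are illustrative, not taken from the book.

```python
# Minimal sketch of weight decay (L2 regularization) in gradient descent.
# A linear model stands in for a network; the penalty term works the same way.
import numpy as np

def train(X, y, decay=0.0, lr=0.1, steps=500):
    """Gradient descent on mean squared error plus a weight-decay penalty."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        # Gradient of 0.5*mean((Xw - y)^2) plus gradient of 0.5*decay*||w||^2.
        grad = X.T @ (X @ w - y) / len(y) + decay * w
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)

w_plain = train(X, y, decay=0.0)
w_decay = train(X, y, decay=1.0)
# The penalty shrinks the weight vector toward zero.
print(np.linalg.norm(w_decay) < np.linalg.norm(w_plain))
```

Shrinking the weights limits the effective complexity of the model, which is the intuition the chapter builds on; choosing the decay coefficient is where cross-validation comes in.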
 

Review: Neural Networks for Pattern Recognition

User Review - DJ - Goodreads

intro to neural networks

Contents

Statistical Pattern Recognition .... 1
Exercises .... 28
Exercises .... 73
Exercises .... 112
Exercises .... 161
Exercises .... 191
Exercises .... 248
Exercises .... 292
Preprocessing and Feature Extraction .... 295
Exercises .... 329
Exercises .... 380
Exercises .... 433
Lagrange Multipliers .... 448
Index .... 477
Copyright

About the author (1995)

Chris Bishop is at Aston University.
