Neural Networks for Pattern Recognition

Oxford University Press, Nov 23, 1995 - Computers - 482 pages
16 Reviews
This book provides the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts of pattern recognition, the book describes techniques for modelling probability density functions, and discusses the properties and relative merits of the multi-layer perceptron and radial basis function network models. It also motivates the use of various forms of error functions, and reviews the principal algorithms for error function minimization. As well as providing a detailed discussion of learning and generalization in neural networks, the book also covers the important topics of data processing, feature extraction, and prior knowledge. The book concludes with an extensive treatment of Bayesian techniques and their applications to neural networks.
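As a flavour of the feed-forward models the book analyses, here is a minimal sketch, in Python with numpy, of the forward pass of a two-layer network with tanh hidden units and linear outputs. This is not the book's notation; layer sizes and weight names here are illustrative.

    import numpy as np

    # Sketch of a two-layer feed-forward network: tanh hidden units
    # feeding linear output units. Shapes and names are illustrative.
    def mlp_forward(x, W1, b1, W2, b2):
        hidden = np.tanh(W1 @ x + b1)   # hidden-unit activations
        return W2 @ hidden + b2         # linear outputs

    rng = np.random.default_rng(0)
    d, m, c = 3, 5, 2                   # input, hidden, output sizes
    W1, b1 = rng.normal(size=(m, d)), np.zeros(m)
    W2, b2 = rng.normal(size=(c, m)), np.zeros(c)
    print(mlp_forward(rng.normal(size=d), W1, b1, W2, b2))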
  

What people are saying

User ratings

5 stars: 7
4 stars: 6
3 stars: 2
2 stars: 0
1 star: 1

User Review

It has been a long time since 1995, and many new techniques and important developments have appeared in the field of A.I. and, more concretely, machine learning. Still, this book has aged very well, for two reasons. First, it covers the fundamental techniques and concepts that every practitioner must understand and be able to use: for example, non-parametric density estimation (kNN), dimensionality reduction (PCA), and mixture models, in addition to, of course, neural networks. Second, with its last chapter on Bayesian techniques, this book paves the way for moving on to modern techniques like deep energy models and deep belief networks.
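For instance, the PCA the book presents reduces to an eigen-decomposition of the sample covariance matrix. A minimal numpy sketch (variable names are illustrative, not the book's):

    import numpy as np

    # PCA via eigen-decomposition of the sample covariance matrix.
    def pca(X, n_components):
        Xc = X - X.mean(axis=0)                   # centre the data
        cov = np.cov(Xc, rowvar=False)            # sample covariance
        eigvals, eigvecs = np.linalg.eigh(cov)    # ascending order
        top = eigvecs[:, ::-1][:, :n_components]  # leading directions
        return Xc @ top                           # projected data

    X = np.random.default_rng(1).normal(size=(100, 4))
    print(pca(X, 2).shape)  # (100, 2)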
The explanations are clear and easy to read. Properties of, and advances based on, neural networks are presented in a principled way in the context of statistical pattern recognition. The exercises are wisely chosen to ensure understanding of the presented results and of the conditions under which they were derived.
But this book goes beyond theory. A chapter is devoted to optimization techniques, i.e. the algorithms used to train neural networks in practice. After reading that chapter and going through the exercises, you will have a good understanding of conjugate gradients and BFGS.
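To make that concrete, here is a hedged sketch, assuming numpy and scipy, of training a tiny one-hidden-layer network by handing its sum-of-squares error to off-the-shelf conjugate-gradient and BFGS routines. The book derives these algorithms rather than calling a library; all names and sizes here are illustrative.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(50, 1))
    t = np.sin(3 * X).ravel()                  # toy regression targets
    m = 6                                      # hidden units

    def error(w):
        # Unpack the flat parameter vector into layer weights.
        W1 = w[:m].reshape(m, 1); b1 = w[m:2*m]
        W2 = w[2*m:3*m];          b2 = w[3*m]
        y = np.tanh(X @ W1.T + b1) @ W2 + b2   # network outputs
        return 0.5 * np.sum((y - t) ** 2)      # sum-of-squares error

    w0 = rng.normal(scale=0.5, size=3*m + 1)
    for method in ("CG", "BFGS"):              # conjugate gradients, quasi-Newton
        res = minimize(error, w0, method=method)
        print(method, "final error:", round(res.fun, 4))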
The chapter on how to improve generalization, either by optimizing the structure of the network or by combining multiple classifiers, is kept at an intuitive level, yet the concepts are well motivated and the few mathematical details help you achieve a solid grasp of why those ideas work. As in the rest of the chapters, it explains how to carry this out in practice, i.e. how I can check whether my classifier has actually become better. By the end of the chapter the reader is familiar with the concepts of regularization (weight decay), cross-validation, and bagging.
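As an illustration of weight decay: the regularized error adds a quadratic penalty, lambda/2 * ||w||^2, to the data term. For a linear model the minimizer is available in closed form (ridge regression). A minimal numpy sketch, with lam an illustrative value one would normally choose by cross-validation, as the chapter describes:

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=(30, 5))
    t = X @ np.array([1.0, -2.0, 0.0, 0.5, 3.0]) + 0.1 * rng.normal(size=30)

    lam = 0.1                                        # weight-decay coefficient
    I = np.eye(X.shape[1])
    w = np.linalg.solve(X.T @ X + lam * I, X.T @ t)  # regularized solution
    print(w)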
 

User Review

This book came out at about the same time as Ripley's, which has almost the same title, but in reverse. At the time, I liked Ripley's better, because it covered more things that were totally new to me. Then a friend said he had chosen Bishop for a course he was teaching, and I went back and reconsidered the two books. I soon found that my friend was right: Bishop's book is better laid out for a course in that it starts at the beginning (well, not quite the beginning: you do need to be fairly sophisticated mathematically) and works up, while Ripley's is more a collection of insights all at the same level, confusing to learn from. Bishop is able to cover both theoretical and practical aspects well. There certainly are topics that aren't covered, but the ones that are there fit together nicely, are accurate and up to date, and are easy to understand. It has migrated from my bookcase to my desk, where it now stays, and I reach for it often. To the reviewer who said "I was looking forward to a detailed insight into neural networks in this book. Instead, almost every page is plastered up with sigma notation", that's like saying about a book on music theory "Instead, almost every page is plastered with black-and-white ovals (some with sticks on the edge)." Or to the reviewer who complains this book is limited to the mathematical side of neural nets, that's like complaining about a cookbook on beef being limited to the carnivore side. If you want a non-technical overview, you can get that elsewhere, but if you want understanding of the techniques, you have to understand the math. Otherwise, there's no beef.

Contents

Statistical Pattern Recognition   1
Exercises   28
Exercises   73
Exercises   112
Exercises   161
Exercises   191
Exercises   248
Exercises   292
Preprocessing and Feature Extraction   295
Exercises   329
Exercises   380
Exercises   433
Lagrange Multipliers   448
Index   477
Copyright


About the author (1995)

Chris Bishop is at Aston University.
