The first edition, published in 1973, has become a classic reference in the field. With this second edition, readers will find coverage of key new topics such as neural networks and statistical pattern recognition, the theory of machine learning, and the theory of invariances. Also included are worked examples, comparisons between different methods, extensive graphics, expanded exercises, and computer project topics.
An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
Contents
Maximum-Likelihood and Bayesian
4 Nonparametric Techniques
(15 other sections not shown)