## Neural Networks: A Comprehensive Foundation

It examines all the important aspects of this emerging technology, covering the learning process, back-propagation, radial basis functions, recurrent networks, self-organizing systems, modular networks, temporal processing, neurodynamics, and VLSI implementation. New chapters delve into such areas as support vector machines and reinforcement learning/neurodynamic programming, and readers will find an entire chapter of case studies illustrating the real-life, practical applications of neural networks.


### Contents

Introduction | 1 |

2 Learning Processes | 50 |

Single Layer Perceptrons | 117 |

Copyright

13 other sections not shown


### Common terms and phrases

activation function adaptive applied approximation attractor back-propagation back-propagation algorithm Boltzmann machine Chapter classification computation condition convergence corresponding cost function defined denote derivative described in Eq desired response dimensionality discussed distribution dynamic eigenvalue entropy equation estimate example experts feature map feedback FIGURE filter follows Gaussian gradient graph Hebbian Hessian matrix hidden layer hidden neurons HME model Hopfield Hopfield network input layer input space input vector iteration learning learning-rate parameter linear LMS algorithm Lyapunov method minimization multilayer perceptron mutual information neural network neuron nodes noise nonlinear operation optimal output layer output neuron pattern performance principal components principal components analysis probability density function problem radial-basis function random variable RBF network recurrent network represents respect result Section self-organizing sigmoid statistical stochastic supervised learning support vector machine theorem theory training data training sample training set update variance VC dimension zero