Neural networks: a comprehensive foundation
This book offers one of the most comprehensive treatments available of neural networks from an engineering perspective. Thorough, well organized, and up to date, it examines the important aspects of the field, including the learning process, back-propagation learning, radial-basis function networks, self-organizing systems, modular networks, temporal processing and neurodynamics, and VLSI implementation of neural networks. Written in a concise, fluid manner by a foremost engineering textbook author, it is ideal for professional engineers and graduate students entering the field. Computer experiments, problems, worked examples, a bibliography, photographs, and illustrations reinforce key concepts.
Contents include a chapter on Correlation Matrix Memory (15 other sections not shown).