Neural Network Models: Theory and Projects
Springer Science & Business Media, May 30, 1997 - Technology & Engineering - 174 pages
Providing an in-depth treatment of neural network models, this volume explains and proves the main results in a clear and accessible way. It presents the essential principles of nonlinear dynamics as derived from neurobiology, and investigates the stability, convergence behaviour and capacity of networks. Also included are sections on stochastic networks and simulated annealing, presented using Markov processes rather than statistical physics, and a chapter on backpropagation. Each chapter ends with a suggested project designed to help the reader develop an integrated knowledge of the theory by placing it within a practical application domain. Neural Network Models: Theory and Projects concentrates on the essential parameters and results that will enable the reader to design hardware or software implementations of neural networks and to critically assess existing commercial products.
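Among the models the book treats are associative memories of the Hopfield type, in which a set of "fundamental memories" is stored in a weight matrix and recalled by a sign-threshold update rule. As a hedged illustration of that idea (a minimal sketch in NumPy; the function names and the specific memories are illustrative assumptions, not the book's own code):

```python
import numpy as np

def train(memories):
    """Hebbian outer-product weights W = (1/N) * sum_m x_m x_m^T, zero diagonal."""
    memories = np.asarray(memories, dtype=float)
    n = memories.shape[1]
    w = memories.T @ memories / n
    np.fill_diagonal(w, 0.0)  # no self-connections
    return w

def recall(w, state, steps=10):
    """Synchronous sign-rule updates until a fixed point (or a step limit)."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        new = np.where(w @ s >= 0, 1.0, -1.0)
        if np.array_equal(new, s):  # reached a stable state
            break
        s = new
    return s

# Two +/-1 "fundamental memories" of length 6 (illustrative data).
memories = [[1, -1, 1, -1, 1, -1],
            [1, 1, 1, -1, -1, -1]]
w = train(memories)

noisy = [1, -1, 1, -1, 1, 1]  # first memory with its last bit flipped
restored = recall(w, noisy)   # converges back to the first stored memory
print(restored)
```

The sketch shows why capacity matters, a question the book analyses: with too many stored patterns relative to the number of neurons, such a network develops spurious fixed points and recall fails.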