ARTIFICIAL NEURAL NETWORKS
Designed as an introductory textbook on artificial neural networks for postgraduate and senior undergraduate students in any branch of engineering, this self-contained and well-organized book highlights the need for new models of computing based on the fundamental principles of neural networks. Professor Yegnanarayana distills into a single volume his many years of teaching and research experience in speech processing, image processing, artificial intelligence, and neural networks. He gives a masterly analysis of topics such as the basics of artificial neural networks, functional units of artificial neural networks for pattern recognition tasks, feedforward and feedback neural networks, and architectures for complex pattern recognition tasks. Throughout, the emphasis is on the pattern processing capability of neural networks, and the presentation of real-world applications gives the discussion a practical thrust.
BASICS OF ARTIFICIAL NEURAL NETWORKS 15–39
ACTIVATION AND SYNAPTIC DYNAMICS 40–75
FUNCTIONAL UNITS OF ANN FOR PATTERN RECOGNITION TASKS
FEEDFORWARD NEURAL NETWORKS 88–141
FEEDBACK NEURAL NETWORKS 142–200
COMPETITIVE LEARNING NEURAL NETWORKS 201–232