Mathematical Methods for Neural Network Analysis and Design
This graduate-level text teaches students how to use a small number of powerful mathematical tools for analyzing and designing a wide variety of artificial neural network (ANN) systems, including their own customized neural networks.
Mathematical Methods for Neural Network Analysis and Design offers an original, broad, and integrated approach that explains each tool in a manner independent of any specific ANN system. Although most of the methods presented are familiar, their systematic application to neural networks is new. Included are helpful chapter summaries and detailed solutions to over 100 ANN system analysis and design problems. For convenience, many of the proofs of the key theorems have been rewritten so that the entire book uses a relatively uniform notation.
This text is unique in several ways. It is organized according to categories of mathematical tools—for investigating the behavior of an ANN system, for comparing (and improving) the efficiency of system computations, and for evaluating its computational goals—that correspond respectively to David Marr's implementational, algorithmic, and computational levels of description. And instead of devoting separate chapters to different types of ANN systems, it analyzes the same group of ANN systems from the perspective of different mathematical methodologies.
A Bradford Book
Deterministic Nonlinear Dynamical Systems Analysis
Stochastic Nonlinear Dynamical Systems Analysis
Nonlinear Optimization Theory
Rational Inference Measures