Feedforward Neural Network Methodology
The decade prior to publication has seen an explosive growth in computational speed and memory and a rapid enrichment in our understanding of artificial neural networks. These two factors have cooperated to at last provide systems engineers and statisticians with a working, practical, and successful ability to routinely make accurate, complex, nonlinear models of such ill-understood phenomena as physical, economic, social, and information-based time series and signals, and of the patterns hidden in high-dimensional data. The models are based closely on the data itself and require only little prior understanding of the stochastic mechanisms underlying these phenomena. Among these models, the feedforward neural networks, also called multilayer perceptrons, have lent themselves to the design of the widest range of successful forecasters, pattern classifiers, controllers, and sensors. In a number of problems in optical character recognition and medical diagnostics, such systems provide state-of-the-art performance, and such performance is also expected in speech recognition applications. The successful application of feedforward neural networks to time series forecasting has been demonstrated many times, quite visibly in the formation of market funds in which investment decisions are based largely on neural network–based forecasts of performance. The purpose of this monograph, accomplished by exposing the methodology driving these developments, is to enable you to engage in these applications and, by being brought to several research frontiers, to advance the methodology itself.
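To make the central object concrete: a feedforward network (multilayer perceptron) with one hidden layer of sigmoidal nodes maps an input vector through a nonlinear hidden layer to a linear output node. The sketch below is an illustrative assumption in NumPy, not code from the monograph; the function name `mlp_forward` and the specific layer sizes are hypothetical.

```python
import numpy as np

def sigmoid(z):
    # Logistic node function, one common choice of sigmoidal unit.
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass of a single-hidden-layer feedforward network.

    x  : input vector, shape (d,)
    W1 : first-layer weights, shape (h, d); b1 : hidden biases, shape (h,)
    W2 : output-layer weights, shape (m, h); b2 : output biases, shape (m,)
    Returns the network output, shape (m,).
    """
    hidden = sigmoid(W1 @ x + b1)   # nonlinear hidden-layer activations
    return W2 @ hidden + b2         # linear output node

# Tiny example: 3 inputs, 4 hidden nodes, 1 output, random parameters.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal(4)
W2, b2 = rng.standard_normal((1, 4)), rng.standard_normal(1)
y = mlp_forward(x, W1, b1, W2, b2)
```

Training such a network amounts to choosing the parameter vector (W1, b1, W2, b2) to minimize an empirical error, typically by gradient methods such as backpropagation-computed steepest descent or conjugate gradients, the topics of the chapters listed below.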
Alternative Single-Neuron Models
Choice of Node Functions
Backpropagation Algorithm for Gradient Evaluation
Conjugate Gradient Algorithms
Remarks on Computing the Hessian
Architecture Selection and Penalty Terms
Generalization and Learning
Asymptotic Distribution of Training Algorithm Parameter
Note on Use as a Text