## Nonlinear Dynamical Systems: Feedforward Neural Network Perspectives

The first truly up-to-date look at the theory and capabilities of nonlinear dynamical systems that take the form of feedforward neural network structures.

Considered among the most important structures in the study of neural networks and neural-like networks, feedforward networks incorporating dynamical elements have important properties and are of use in many applications. Specializing in experiential knowledge, a neural network stores and expands its knowledge base via strikingly human routes: through a learning process and information storage involving interconnection strengths known as synaptic weights.

In *Nonlinear Dynamical Systems: Feedforward Neural Network Perspectives*, six leading authorities describe recent contributions to the development of an analytical basis for understanding and using nonlinear dynamical systems of the feedforward type, especially in the areas of control, signal processing, and time series analysis. Moving from an introductory discussion of the different aspects of feedforward neural networks, the book then addresses:

* Classification problems and the related problem of approximating dynamic nonlinear input-output maps
* The development of robust controllers and filters
* The capability of neural networks to approximate functions and dynamic systems with respect to risk-sensitive error
* Segmenting a time series

It then sheds light on the application of feedforward neural networks to speech processing, summarizing speech-related techniques and reviewing feedforward neural networks from the viewpoint of fundamental design issues. An up-to-date and authoritative look at the ever-widening technical boundaries and influence of neural networks in dynamical systems, this volume is an indispensable resource for researchers in neural networks and a reference staple for libraries.
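As a minimal illustration of the learning process mentioned above (a generic sketch, not a construction from the book): a single-hidden-layer feedforward network whose synaptic weights are adjusted by gradient descent on a squared-error objective. The target function, layer sizes, and learning rate are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: learn y = sin(x) on [-pi, pi] from 200 samples
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(X)

# One hidden layer of sigmoidal units; W1, b1, W2, b2 are the synaptic weights
W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y                          # gradient of 0.5 * squared error
    # Backward pass (chain rule), gradients averaged over the batch
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * h * (1 - h)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    # Gradient-descent update of the interconnection strengths
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

The stored "knowledge" is nothing but the trained weight matrices; presenting new inputs to the same forward pass reuses it.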



### Popular passages

Page v - Our main result is a theorem that gives, in a certain setting, a necessary and sufficient condition under which discrete-space multidimensional shift-invariant input-output maps with vector-valued inputs drawn from a certain large set can be uniformly approximated arbitrarily well, using a structure consisting of a linear preprocessing stage followed by a memoryless nonlinear network.
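The structure named in this passage can be sketched concretely (a minimal illustration with arbitrary random weights, not the book's construction or proof): a linear, shift-invariant preprocessing stage realized as a bank of FIR filters, followed by a memoryless sigmoidal network applied sample by sample. The filter count, memory depth, and layer sizes below are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

MEMORY = 4  # memory depth of the linear preprocessing stage

# Linear, shift-invariant preprocessing: 3 causal FIR filters of length MEMORY
F = rng.normal(0, 1, (3, MEMORY))

# Memoryless nonlinear network: acts on the filter outputs at each time step
W = rng.normal(0, 1, (3, 8)); b = rng.normal(0, 1, 8)
v = rng.normal(0, 1, 8)

def apply_map(u):
    """Linear shift-invariant preprocessing followed by a memoryless network."""
    # z[k, n] = sum_j F[k, j] * u[n - j], with zero initial conditions
    z = np.stack([np.convolve(u, F[k], mode="full")[:len(u)] for k in range(3)])
    h = np.tanh(z.T @ W + b)   # memoryless: depends only on z at time n
    return h @ v

u = rng.normal(0, 1, 64)
y = apply_map(u)

# Shift-invariance of the overall map: delaying the input by 5 samples
# delays the output by the same 5 samples
u_shift = np.concatenate([np.zeros(5), u])
y_shift = apply_map(u_shift)
```

Because the nonlinear stage has no memory, all dynamics live in the linear front end, which is what makes the overall input-output map shift-invariant by construction.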