Limitations and Future Trends in Neural Computation

Sergey Ablameyko
IOS Press, 2003 - Computers - 245 pages
This book reports critical analyses of complexity issues in the continuum setting and of generalization to new examples, two basic milestones in learning from examples in connectionist models. The problem of loading the weights of neural networks, often framed as continuous optimization, has drawn many criticisms, since the potential solution of any learning problem is severely limited by the presence of local minima in the error function. The maturity of the field requires converting the quest for a general solution to all learning problems into an understanding of which learning problems are likely to be solved efficiently. Likewise, the notion of an efficient solution needs to be formalized so as to allow useful comparisons with the traditional theory of computational complexity in the discrete setting. The book covers these topics, focusing also on recent developments in computational mathematics, where interesting notions of computational complexity emerge in the continuum setting.
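
To make the local-minima issue concrete, the following is a minimal sketch (not taken from the book) of the weight-loading problem the blurb describes: plain gradient descent on a small sigmoid network trained on XOR. Depending on the random initialization, training may converge to near-zero error or stall at a higher error level, which illustrates why continuous optimization alone cannot guarantee that a learning problem is solved. The network size, learning rate, step count, and seeds are illustrative assumptions.

    # Illustrative sketch: the "loading problem" as continuous optimization.
    # Gradient descent on a tiny sigmoid network fitting XOR; some random
    # initializations may stall at a poor error level (a local minimum or
    # plateau), others reach near-zero error.
    import numpy as np

    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train(seed, hidden=2, lr=0.5, steps=20000):
        rng = np.random.default_rng(seed)
        W1 = rng.normal(size=(2, hidden))
        b1 = np.zeros(hidden)
        W2 = rng.normal(size=(hidden, 1))
        b2 = np.zeros(1)
        for _ in range(steps):
            # forward pass
            h = sigmoid(X @ W1 + b1)
            out = sigmoid(h @ W2 + b2)
            # backward pass for the squared-error loss
            d_out = (out - y) * out * (1 - out)
            d_h = (d_out @ W2.T) * h * (1 - h)
            W2 -= lr * h.T @ d_out
            b2 -= lr * d_out.sum(axis=0)
            W1 -= lr * X.T @ d_h
            b1 -= lr * d_h.sum(axis=0)
        return 0.5 * np.sum((out - y) ** 2)

    # Different seeds can land in qualitatively different minima.
    for seed in range(5):
        print(f"seed {seed}: final error {train(seed):.4f}")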

Contents

The Complexity of Computing with Continuous Time Devices  23
Energy-Based Computation with Symmetric Hopfield Nets  45
Computational Complexity and the Elusiveness of Global Optima  71
Impact of Neural Networks on Signal Processing and Communications  95
Empirical Risk  115
Learning High-dimensional Data  141
The Curse of Dimensionality and the Blessing of Multiply Hybrid  163
TeraOPS Stored  177
Reliability of Man-System Interaction and Theory of Neural Networks  216
Author Index  245
Copyright