Neural Networks: A Comprehensive Foundation

Prentice Hall, 1999 - Computers - 842 pages
Introduction; Learning processes; Single-layer perceptrons; Multilayer perceptrons; Radial-basis function networks; Support vector machines; Committee machines; Principal components analysis; Self-organizing maps; Information-theoretic models; Stochastic machines and their approximates rooted in statistical mechanics; Neurodynamic programming; Temporal processing using feedforward networks; Neurodynamics; Dynamically driven recurrent networks; Epilogue; Bibliography; Index.


Contents

Introduction 1
Learning Processes 50
Committee Machines 351

15 other sections not shown
