Analogue Imprecision in MLP Training

World Scientific, 1996 - Computers - 178 pages
Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines the implications for learning and network performance. The aim of the book is to show how including an imprecision model in the learning scheme, as a 'fault tolerance hint', can aid understanding of the accuracy and precision requirements of a particular implementation. In addition, the study shows how such a scheme can give rise to significant performance enhancement.
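The core idea the blurb refers to, training a multilayer perceptron while its synaptic weights are perturbed so that the learned solution tolerates analogue imprecision, can be illustrated in a few lines. The sketch below is a minimal NumPy example using multiplicative Gaussian weight noise during training; the network size, noise level sigma and learning rate are illustrative assumptions, not values or procedures taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-4-1 MLP (with biases) trained on XOR, with multiplicative
# Gaussian weight noise injected at every forward/backward pass to
# model analogue synaptic imprecision (assumed noise model).
W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)
sigma = 0.05   # assumed relative weight-noise level
lr = 0.5

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    # Perturb the nominal weights; biases left noise-free for simplicity.
    nW1 = W1 * (1 + sigma * rng.normal(size=W1.shape))
    nW2 = W2 * (1 + sigma * rng.normal(size=W2.shape))

    # Forward pass through the noisy weights.
    h = sigmoid(X @ nW1 + b1)
    out = sigmoid(h @ nW2 + b2)

    # Backpropagate squared error through the noisy weights,
    # but apply the updates to the nominal (noise-free) weights.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ nW2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # should approach the XOR targets [0, 1, 1, 0]
```

Because each update is computed with a freshly perturbed copy of the weights, the training process favours weight configurations whose performance is insensitive to such perturbations, which is the sense in which weight noise acts as a fault tolerance hint.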
 


Contents

Preface 1
Noise in Neural Implementations 3
Fault Tolerance 81
Generalisation Ability 99
Learning Trajectory and Speed 121
Penalty Terms for Fault Tolerance 137
Conclusions 147
Appendix A Fault Tolerance Hints: The General Case 159
Bibliography 165
Index 173
Copyright
