Competitively Inhibited Neural Networks for Adaptive Parameter Estimation

Artificial Neural Networks have captured the interest of many researchers in the last five years. As with many young fields, neural network research has been largely empirical in nature, relying strongly on simulation studies of various network models. Empiricism is, of course, essential to any science, for it provides a body of observations allowing initial characterization of the field. Eventually, however, any maturing field must begin the process of validating empirically derived conjectures with rigorous mathematical models. It is in this way that science has always proceeded. It is in this way that science provides conclusions that can be used across a variety of applications. This monograph by Michael Lemmon provides just such a theoretical exploration of the role of competition in Artificial Neural Networks. There is "good news" and "bad news" associated with theoretical research in neural networks. The bad news is that such work usually requires understanding and bringing together results from many seemingly disparate disciplines, such as neurobiology, cognitive psychology, the theory of differential equations, large-scale systems theory, computer science, and electrical engineering. The good news is that for those capable of making this synthesis, the rewards are rich, as exemplified in this monograph.
Contents
The CINN Algorithm
The Continuum Model
CINN Learning
Copyright