Since the second edition of this book came out in early 1997, the number of scientific papers published on the Self-Organizing Map (SOM) has increased from about 1500 to some 4000. Two special workshops dedicated to the SOM have also been organized, not to mention numerous SOM sessions at neural network conferences. In view of this growing interest, it was felt desirable to make extensive revisions to this book. They are of the following nature. Statistical pattern analysis is now approached more carefully than before. A more detailed discussion of the eigenvectors and eigenvalues of symmetric matrices, which are the type usually encountered in statistics, has been included in Sect. 1.1.3; also, new probabilistic concepts, such as factor analysis, are discussed in Sect. 1.3.1. A survey of projection methods (Sect. 1.3.2) has been added in order to relate the SOM to classical paradigms. Vector Quantization is now discussed in one main section, and a derivation of the point density of the codebook vectors using the calculus of variations has been added, in order to familiarize the reader with this otherwise complicated statistical analysis. It was also felt that the discussion of the neural-modeling philosophy should include a broader perspective on the main issues. A historical review in Sect. 2.2, and the general philosophy in Sects. 2.3, 2.5 and 2.14, are now expected to help especially newcomers orient themselves amongst the profusion of contemporary neural models.
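The eigendecomposition of symmetric matrices mentioned above (Sect. 1.1.3) underlies the classical linear projection methods that the book compares the SOM against. The following is a minimal illustrative sketch, not code from the book; the data, dimensions, and parameter choices are assumptions made for the example.

```python
import numpy as np

# A sample covariance matrix is symmetric, so its eigenvalues are real and
# its eigenvectors can be chosen orthonormal -- the case treated in Sect. 1.1.3.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))        # toy data: 500 samples in R^3 (illustrative)
C = np.cov(X, rowvar=False)          # symmetric 3x3 covariance matrix

# eigh is specialized for symmetric (Hermitian) matrices; it returns real
# eigenvalues in ascending order and orthonormal eigenvectors as columns.
eigvals, eigvecs = np.linalg.eigh(C)

# Orthonormality check: V^T V equals the identity matrix.
assert np.allclose(eigvecs.T @ eigvecs, np.eye(3))

# Projecting the data onto the leading eigenvectors gives the classical
# principal-component projection, one of the linear methods surveyed in
# Sect. 1.3.2 as a point of comparison for the SOM.
projected = X @ eigvecs[:, -2:]      # keep the two largest components
print(projected.shape)
```

The choice of `numpy.linalg.eigh` over the general `eig` exploits symmetry: it guarantees real eigenvalues and an orthonormal eigenvector basis.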
Contents
- The Basic SOM
- Physiological Interpretation of SOM
- Variants of SOM
- Learning Vector Quantization
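The basic SOM named above centers on a simple incremental update: find the best-matching unit for an input, then pull that unit and its lattice neighbors toward the input. The sketch below is an illustrative toy implementation under assumed parameter schedules (names, rates, and radii are not taken from the book).

```python
import numpy as np

# Illustrative sketch of the incremental SOM update:
#   m_i(t+1) = m_i(t) + h_ci(t) * [x(t) - m_i(t)],
# where c is the winner (best-matching unit) and h_ci is a neighborhood
# function on the lattice. All parameter values here are assumptions.
rng = np.random.default_rng(1)
n_units = 10
codebook = rng.normal(size=(n_units, 2))   # reference (codebook) vectors
lattice = np.arange(n_units)               # unit coordinates on a 1-D lattice

def train_step(x, t, n_steps):
    # Winner: unit whose reference vector is nearest in Euclidean distance.
    c = np.argmin(np.linalg.norm(codebook - x, axis=1))
    # Learning rate and neighborhood radius both shrink over time.
    alpha = 0.5 * (1 - t / n_steps)
    sigma = max(1.0, (n_units / 2) * (1 - t / n_steps))
    # Gaussian neighborhood on the lattice, centered at the winner.
    h = alpha * np.exp(-((lattice - c) ** 2) / (2 * sigma ** 2))
    codebook[:] = codebook + h[:, None] * (x - codebook)

data = rng.uniform(-1, 1, size=(2000, 2))  # toy uniform input distribution
for t, x in enumerate(data):
    train_step(x, t, len(data))

# After training, neighboring lattice units should hold nearby reference
# vectors: the topological ordering that gives the SOM its name.
print(codebook.shape)
```

Learning Vector Quantization uses the same winner search but a supervised update, moving only the winner toward or away from the input depending on class agreement.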