Higher Order Artificial Neural Networks

DIANE Publishing, Jun 1, 1990 - Computers - 25 pages
An investigation of the storage capacity of an artificial neural network where the state of each neuron depends on quadratic correlations of all other neurons, i.e., a third-order network. Graphs. Bibliography.
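A third-order network of the kind described can be sketched as follows. This is a minimal illustrative Python sketch: the Hebb-like storage rule, the retained diagonal terms, and all names are assumptions for illustration, not the report's exact conventions.

```python
import itertools

def store(patterns):
    """Hebb-like third-order couplings: T[i][j][k] is the sum over
    stored patterns of xi_i * xi_j * xi_k (diagonal terms are left in
    for brevity; the report zeros the diagonal planes)."""
    N = len(patterns[0])
    T = [[[0.0] * N for _ in range(N)] for _ in range(N)]
    for xi in patterns:
        for i, j, k in itertools.product(range(N), repeat=3):
            T[i][j][k] += xi[i] * xi[j] * xi[k]
    return T

def update(S, T):
    """One synchronous update: each neuron's new state depends on
    quadratic correlations of the others.  The double sum costs
    O(N^2) per neuron, O(N^3) for the whole lattice."""
    N = len(S)
    new = []
    for i in range(N):
        h = sum(T[i][j][k] * S[j] * S[k]
                for j in range(N) for k in range(N))
        new.append(1 if h >= 0 else -1)
    return new
```

Starting from a noisy copy of a stored pattern and iterating `update` should relax back toward the pattern, provided only a few patterns are stored.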
 


Popular passages

Page 8 - Unfortunately, a higher order network is not isomorphic to any simple physical system, which complicates the analysis. The assumption that the state of each neuron depends on correlations of all other neurons is fairly "unphysical", and methods from statistical physics are more difficult to apply. We therefore do not pursue an analysis in this direction. Instead, a simplified treatment and approximate calculations are made whenever they enlighten the discussion.
Page 22 - There is a trade-off between an increase in the number of connections and the number of patterns which can be stored. It is therefore interesting to compare the storage capacity divided by the number of connections for the original Hopfield and the 3'rd order network. Both numbers are identical within the error limits.
Page 22 - It seems therefore that there exists some kind of universality. The number of patterns which may be stored per connection, keeping an acceptable recall rate, is independent of the order of the network. If one trusted a pure statistical analysis similar to the one made in eqs. (5,6), this universality would have been anticipated.
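The comparison behind this universality divides the number of stored patterns by the number of connections. Assuming one weight per unordered set of distinct neurons (a fully symmetric tensor with the diagonal planes removed, an illustrative convention), the connection counts can be tallied as:

```python
from math import comb

def num_connections(N, order):
    # distinct couplings of a fully connected network of the given
    # order: one weight per unordered set of `order` distinct neurons
    return comb(N, order)

pairs   = num_connections(100, 2)   # standard Hopfield: 4950
triples = num_connections(100, 3)   # 3'rd order network: 161700
```

For N = 100 neurons, a second-order network has 4,950 couplings while a third-order one has 161,700, which is why the increased storage capacity comes at the price of many more connections.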
Page 9 - The equation of motion clearly illustrates the problem with a 3'rd order network. A double sum needs to be computed for each updating of every single neuron. In a serial computer this means that the CPU time for a single neuron updating is of O(N²), and for the complete lattice O(N³).
Page 17 - A slight overestimation is made. It must be noted that the corresponding analysis of the original Hopfield model also gives a Tc (= 1) which is an overestimation by roughly the same factor. The nontrivial fixed point disappears discontinuously when T grows larger than Tc.
Page 8 - Despite the large number of connections, it is sometimes valuable to be able to store many patterns on a limited set of neurons. Many applications favour the increased storage capacity, and the larger complexity needed in a circuit representation of a 3'rd order network may not be of decisive importance.
Page 12 - A slightly more detailed analysis, which has previously been made for the original Hopfield model, can also be made here. The analysis is a purely statistical one, which does not correctly treat cross-talk between different patterns and therefore gives a false answer. As will be demonstrated, something can be learnt anyway. As in the equation above, the local field h_i may be divided into a memory term and a noise term as h_i = ξ_i^μ (m^μ)² + η, where m ...
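The split of the local field into a memory term and a noise term can be checked numerically. Assuming Hebb-like third-order couplings (an illustrative choice, not necessarily the report's normalization), h_i reduces exactly to ξ_i^μ (m^μ)² plus cross-talk from the other patterns:

```python
def local_field(i, S, patterns):
    """h_i computed from the full double sum over the assumed
    Hebb-like couplings T[i][j][k] = (1/N^2) sum_nu xi_i xi_j xi_k."""
    N = len(S)
    h = 0.0
    for j in range(N):
        for k in range(N):
            Tijk = sum(xi[i] * xi[j] * xi[k] for xi in patterns) / N**2
            h += Tijk * S[j] * S[k]
    return h

def memory_and_noise(i, S, patterns, mu):
    """Split h_i into the memory term xi_i^mu (m^mu)^2 and the
    cross-talk noise contributed by all other patterns nu != mu."""
    N = len(S)
    m = [sum(xi[j] * S[j] for j in range(N)) / N for xi in patterns]
    memory = patterns[mu][i] * m[mu] ** 2
    noise = sum(patterns[nu][i] * m[nu] ** 2
                for nu in range(len(patterns)) if nu != mu)
    return memory, noise
```

When S equals the stored pattern μ, the overlap m^μ is 1 and the memory term is simply ξ_i^μ; retrieval works as long as the cross-talk noise stays smaller in magnitude.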
Page 15 - In the numerical analysis presented below, the binary stochastic S's are governed by the Metropolis algorithm at nonzero temperatures. One obvious reason is that the average energy is difficult to compute in mean field theory since ...
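A sketch of the Metropolis algorithm for binary ±1 neurons at nonzero temperature. The function names and bookkeeping are illustrative assumptions, and the energy function is supplied by the caller so any third-order form can be plugged in:

```python
import math
import random

def metropolis_sweep(S, energy, temperature, rng):
    """One Metropolis sweep over binary +/-1 neurons: propose N
    single-spin flips, accepting each with probability
    min(1, exp(-dE / temperature))."""
    E = energy(S)
    for _ in range(len(S)):
        i = rng.randrange(len(S))
        S[i] = -S[i]                          # propose a flip
        dE = energy(S) - E
        if dE <= 0 or rng.random() < math.exp(-dE / temperature):
            E += dE                           # accept the move
        else:
            S[i] = -S[i]                      # reject: undo the flip
    return S
```

At temperatures well below Tc, flips that leave an energy minimum are almost always rejected, so a stored pattern remains stable under the sweep.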
Page 8 - W_ijk is completely symmetric with zeros in the diagonal planes. Note that the twofold degeneracy of the original Hopfield model, which corresponds to a global flip of every spin, has been removed...
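A coupling tensor with these symmetry properties can be sketched as follows (a Hebb-like storage rule is assumed here for illustration):

```python
import itertools

def build_couplings(patterns):
    """Hebb-like third-order couplings, fully symmetric and with
    zeros on the diagonal planes (whenever two indices coincide)."""
    N = len(patterns[0])
    W = [[[0.0] * N for _ in range(N)] for _ in range(N)]
    for i, j, k in itertools.product(range(N), repeat=3):
        if i == j or j == k or i == k:
            continue                          # zero the diagonal planes
        W[i][j][k] = sum(xi[i] * xi[j] * xi[k] for xi in patterns)
    return W
```

Because each entry is a product symmetric in (i, j, k), the tensor is invariant under any permutation of its indices, and the guard clause zeroes every diagonal plane.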
Page 17 - Treating the V's as continuous dynamical variables and eq. (12) as the equation of motion, a peculiar "hysteresis" effect is revealed. If T is increased from a low value, well below Tc, and a nonzero...
