Hopfield Type Phase Neural Network
PDF (Russian)

Keywords

phase neuron
associative memory
Hebbian learning rule
recognition

How to Cite

1. Kryzhanovsky B.V. Hopfield Type Phase Neural Network // Russian Journal of Cybernetics. 2024. Vol. 5, № 2. P. 8-12. DOI: 10.51790/2712-9942-2024-5-2-01.

Abstract

This study examines the properties of a fully connected neural network of phase neurons trained with the Hebbian learning rule. The signals transmitted through the network's interconnections are single pulses with specific phases. A neuron's firing rule is defined as follows: among the signals arriving at the neuron's input, the phase component with the highest amplitude is identified, and the neuron emits a single pulse with that phase. The phases encoding the components of the associative-memory vectors are randomly distributed. To estimate the recognition error, we employ the Chernoff-Chebyshev technique, which does not depend on the type of phase-encoding distribution. Our findings demonstrate that the associative memory capacity of this network is four times greater than that of a traditional Hopfield network operating with binary patterns; correspondingly, the radius of the attraction region is also four times larger.
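The firing rule described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it assumes Q equidistant phases on the unit circle, a complex-valued Hebbian weight matrix W_ij = Σ_m x_i^m (x_j^m)*, and a synchronous update in which each neuron projects its summed input onto every allowed phase and fires the phase with the largest projection. All parameter values (Q, N, M, noise level) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

Q = 8   # number of allowed phases (assumed discretization)
N = 64  # number of neurons
M = 3   # number of stored patterns (low loading)

phases = np.exp(2j * np.pi * np.arange(Q) / Q)  # unit-amplitude phasors

# Random patterns: each component is a randomly chosen phase
idx = rng.integers(0, Q, size=(M, N))
X = phases[idx]  # (M, N) complex

# Hebbian rule with complex weights: W_ij = sum_m x_i^m conj(x_j^m), zero diagonal
W = X.T @ X.conj()
np.fill_diagonal(W, 0)

def update(s, W, phases):
    """One synchronous step: each neuron fires the allowed phase whose
    projection onto its summed input signal has the largest amplitude."""
    h = W @ s  # summed input pulses at each neuron
    proj = np.real(np.conj(phases)[:, None] * h[None, :])  # (Q, N) projections
    return phases[np.argmax(proj, axis=0)]

# Recall test: corrupt 20% of the components of pattern 0, then iterate
s = X[0].copy()
flip = rng.random(N) < 0.2
s[flip] = phases[rng.integers(0, Q, flip.sum())]
for _ in range(10):
    s = update(s, W, phases)

# Overlap with the stored pattern (1.0 means exact recall)
overlap = abs(np.vdot(s, X[0])) / N
```

At this low loading the corrupted state falls inside the attraction region and the network restores the stored pattern, so `overlap` approaches 1. Note that `abs(np.vdot(...))` makes the overlap insensitive to a global phase rotation, to which such a network is invariant.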

https://doi.org/10.51790/2712-9942-2024-5-2-01

References

Hopfield J. Neural Networks and Physical Systems with Emergent Collective Computational Abilities. Proc. Nat. Acad. Sci. USA. 1982;79:2554–2558.

Hebb D. The Organization of Behavior. New York: Wiley; 1949. 365 p.

Palm G., Sommer F. Information Capacity in Recurrent McCulloch-Pitts Networks with Sparsely Coded Memory States. Network. 1992;3:177–186.

Kuh A., Dickinson B. Information Capacity of Associative Memory. IEEE Trans. on Inf. Theory. 1989;35:59–68.

Herz A., Marcus C. Distributed Dynamics in Neural Networks. Phys. Rev. E. 1993;47:2155–2161.

McEliece R., Posner E. et al. Capacity of the Hopfield Associative Memory. IEEE Trans. on Inf. Theory. 1987;33:461–482.

Bolle D., Dupont P., Mourik J. van. Stability Properties of Potts Neural Networks with Biased Pattern and Low Loading. J. Phys. A. 1991;24:1065.

Kazanovich Y., Borisyuk R. Dynamics of Neural Networks with Central Element. Neural Networks. 1999;12:441–454.

Kryzhanovsky B. V., Magomedov B. M., Mikaelian A. L. A Domain Model of Neural Network. Doklady Mathematics. 2005;71:310–314.

Vogt H., Zippelius A. Invariant Recognition in Potts-Glass Neural Network. J. Phys. A. 1992;25:2209.

Sompolinsky H. Neural Network with Non-linear Synapses and Static Noise. Phys. Rev. A. 1986;34:2571.

Kryzhanovsky B. V., Mikaelian A. L. On the Recognition Ability of a Neural Network Based on Neurons with Parametric Frequency Conversion. Doklady AN. Ser. Mat. Fizika. 2002;383(3):318–321. (in Russian)

Kryzhanovsky B. V., Mikaelian A. L. An Associative Memory Capable of Recognizing Strongly Correlated Patterns. Doklady AN. Ser. Informatika. 2003;390(1):27–31. (in Russian)

Mikaelian A. L., Kryzhanovsky B. V., Litinskii L. B. Parametrical Neural Network. Optical Memory & Neural Networks. 2003;12(3):227–236.

Kryzhanovsky B. V., Litinskii L. B., Mikaelian A. L. Vector-Neuron Models of Associative Memory. 2004 IEEE International Joint Conference on Neural Networks. Budapest; 2004. P. 909–1004.

Kanter I. Potts-Glass Models of Neural Networks. Phys. Rev. A. 1988;37(7):2739.

Bolle D., Dupont P., Huyghebaert J. Thermodynamic Properties of the q-State Potts-Glass Neural Network. Phys. Rev. A. 1992;45:4194.

Wu F. Y. The Potts Model. Rev. Mod. Phys. 1982;54:235.

Amit D. J., Gutfreund H., Sompolinsky H. Information Storage in Neural Networks with Low Levels of Activity. Phys. Rev. A. 1987;35:2293.

Karandashev I., Kryzhanovsky B., Litinskii L. Weighted Patterns as a Tool to Improve the Hopfield Model. Physical Review E. 2012;85:041925.
