Abstract
We study the properties of a neural network whose coupling tensor has a rank greater than two. In this case, in addition to the matrix of synaptic connections, the model contains presynaptic synapses. Coupling tensors of this kind arise in hardware implementations of neural networks based on crossbar arrays. An inherent property of the crossbar scheme is the presence of parasitic currents: the signal on a wire leading to a given neuron leaks through the memory cells (synapses) into the wires of all the other neurons. As a result, noise is superimposed on the neuron's input signal, while the signals reaching all the other neurons are weakened. In this regime, the conductivity of an analog crossbar cell changes in proportion to the noise signal, so the cell's output signal acquires a nonlinear dependence on its input. We show that, for a specific form of the coupling tensor, the performance of the neural network algorithm improves significantly. A Hopfield-type network serves as the example.
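The abstract does not specify the form of the coupling tensor or the update rule, so the sketch below is only illustrative: it assumes a third-order Hebbian tensor (a rank-3 generalization of the Hopfield rule, as in dense associative memories) and asynchronous sign-update dynamics. All names and parameter values here (N, P, the 20% flip rate) are assumptions for demonstration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

N, P = 32, 5                              # neurons, stored patterns (assumed values)
xi = rng.choice([-1, 1], size=(P, N))     # random bipolar patterns

# One possible rank-3 Hebbian coupling tensor (an assumption, not the paper's rule):
# T[i, j, k] = (1 / N^2) * sum_m xi[m, i] * xi[m, j] * xi[m, k]
T = np.einsum('mi,mj,mk->ijk', xi, xi, xi) / N**2

def local_field(s):
    """h_i = sum_{j,k} T[i, j, k] * s_j * s_k (quadratic in the network state)."""
    return np.einsum('ijk,j,k->i', T, s, s)

def recall(s, max_sweeps=20):
    """Asynchronous sign-update dynamics until a fixed point (or sweep limit)."""
    s = s.copy()
    for _ in range(max_sweeps):
        prev = s.copy()
        for i in rng.permutation(N):
            h_i = local_field(s)[i]
            if h_i != 0:
                s[i] = np.sign(h_i)
        if np.array_equal(s, prev):
            break
    return s

# Probe: pattern 0 with 20% of its spins flipped
flips = rng.random(N) < 0.2
probe = np.where(flips, -xi[0], xi[0])
restored = recall(probe)
print('initial overlap:', xi[0] @ probe / N)
print('final overlap:  ', xi[0] @ restored / N)
```

Under this assumed cubic interaction the local field of neuron i is h_i = (1/N^2) sum_m xi_i^m (xi^m . s)^2, so the contribution of the retrieved pattern grows quadratically with its overlap; this illustrates, in a generic way, how a coupling tensor of rank greater than two can sharpen recall relative to the standard rank-2 Hopfield matrix.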