Hopfield network

A Hopfield network (also known as an Ising model of a neural network, the Ising–Lenz–Little model, or the Amari–Little–Hopfield network) is a spin glass system used to model neural networks, based on Ernst Ising's work with Wilhelm Lenz on the Ising model of magnetic materials.[1] Hopfield networks were first described as recurrent neural networks by Shun'ichi Amari in 1972,[2][3] applied to biological neural networks by William Little in 1974,[4] and popularised by John Hopfield in 1982.[5] Hopfield networks serve as content-addressable ("associative") memory systems, with either binary threshold nodes or continuous variables.[6] They also provide a model for understanding human memory.[7][8]
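
The following is a minimal NumPy sketch, added here purely as an illustration of the binary-threshold associative memory described above: a Hebbian learning rule stores a few ±1 patterns in a symmetric weight matrix, and repeated asynchronous threshold updates recall a stored pattern from a corrupted cue. The function names (train_hebbian, recall), the fixed random seed, and the fixed number of update steps are assumptions made for this example, not details taken from the cited sources.

    import numpy as np

    def train_hebbian(patterns):
        # Hebbian rule: sum of outer products of the stored +/-1 patterns,
        # with the diagonal zeroed so units have no self-connections.
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0)
        return W / len(patterns)

    def recall(W, cue, steps=200, seed=0):
        # Asynchronous binary-threshold updates: repeatedly pick a random unit
        # and set it to the sign of its weighted input; a fixed number of steps
        # is enough for this small example to settle into a stored pattern.
        rng = np.random.default_rng(seed)
        state = cue.copy()
        for _ in range(steps):
            i = rng.integers(len(state))
            state[i] = 1 if W[i] @ state >= 0 else -1
        return state

    # Store two patterns, then recall the first from a cue with one flipped bit.
    patterns = np.array([[1, -1, 1, -1, 1, -1],
                         [1, 1, 1, -1, -1, -1]])
    W = train_hebbian(patterns)
    cue = np.array([1, -1, 1, 1, 1, -1])
    print(recall(W, cue))   # expected output: [ 1 -1  1 -1  1 -1]

In this sketch the corrupted cue settles back onto the first stored pattern: with symmetric weights, zero self-connections, and asynchronous updates, each flip can only lower the network's energy, so the dynamics converge to a stored attractor, which is what makes the network usable as a content-addressable memory.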

  1. ^ Brush, Stephen G. (1967). "History of the Lenz-Ising Model". Reviews of Modern Physics. 39 (4): 883–893. Bibcode:1967RvMP...39..883B. doi:10.1103/RevModPhys.39.883.
  2. ^ Amari, Shun-Ichi (1972). "Learning patterns and pattern sequences by self-organizing nets of threshold elements". IEEE Transactions on Computers. C-21: 1197–1206.
  3. ^ Schmidhuber, Juergen (2022). "Annotated History of Modern AI and Deep Learning". arXiv:2212.11279 [cs.NE].
  4. ^ Little, W. A. (1974). "The Existence of Persistent States in the Brain". Mathematical Biosciences. 19 (1–2): 101–120. doi:10.1016/0025-5564(74)90031-5.
  5. ^ Hopfield, J. J. (1982). "Neural networks and physical systems with emergent collective computational abilities". Proceedings of the National Academy of Sciences. 79 (8): 2554–2558. Bibcode:1982PNAS...79.2554H. doi:10.1073/pnas.79.8.2554. PMC 346238. PMID 6953413.
  6. ^ Hopfield, J. J. (1984). "Neurons with graded response have collective computational properties like those of two-state neurons". Proceedings of the National Academy of Sciences. 81 (10): 3088–3092. doi:10.1073/pnas.81.10.3088.
  7. ^ Amit, D.J. (1992). Modeling Brain Function: The World of Attractor Neural Networks. Cambridge University Press. ISBN 978-0-521-42124-9.
  8. ^ Rolls, Edmund T. (2016). Cerebral Cortex: Principles of Operation. Oxford University Press. ISBN 978-0-19-878485-2.