
Lund University Publications


Mean field theory neural networks for feature recognition, content addressable memory and optimization

Peterson, Carsten (1991). In Connection Science 3(1), pp. 3–33
Abstract
Various applications of the mean field theory (MFT) technique for obtaining solutions close to optimal minima in feedback networks are reviewed. Using this method in the context of the Boltzmann machine gives rise to a fast deterministic learning algorithm with a performance comparable with that of the backpropagation algorithm (BP) in feature recognition applications. Since MFT learning is bidirectional, its use can be extended from purely functional mappings to a content addressable memory. The storage capacity of such a network grows like O(10–20)n_H with the number of hidden units n_H. The MFT learning algorithm is local, and thus has an advantage over BP with respect to VLSI implementations. It is also demonstrated how MFT and BP are related in situations where the number of input units is much larger than the number of output units. In the context of finding good solutions to difficult optimization problems, the MFT technique again turns out to be extremely powerful. The quality of the solutions for large travelling salesman and graph partition problems is on a par with that obtained by optimally tuned simulated annealing methods. The algorithm employed here is based on multistate K-valued (K > 2) neurons rather than binary (K = 2) neurons. This method is also advantageous for more nested decision problems like scheduling. The MFT equations are isomorphic to resistance-capacitance equations and hence map naturally onto custom-made hardware. Given the diversity of successful application areas, the MFT approach thus constitutes a convenient platform for hardware development.
Please use this URL to cite or link to this publication:
author: Peterson, Carsten
organization
publishing date: 1991
type: Contribution to journal
publication status: published
subject
in: Connection Science
volume: 3
issue: 1
pages: 30 pages
publisher: Taylor and Francis A.S.
ISSN: 0954-0091
DOI: 10.1080/09540099108946571
language: English
LU publication?: yes
id: c8c14eb7-b469-429a-9d86-3920218afa8f
date added to LUP: 2019-05-31 16:23:59
date last changed: 2025-04-04 14:39:53
@article{c8c14eb7-b469-429a-9d86-3920218afa8f,
  abstract     = {{Various applications of the mean field theory (MFT) technique for obtaining solutions close to optimal minima in feedback networks are reviewed. Using this method in the context of the Boltzmann machine gives rise to a fast deterministic learning algorithm with a performance comparable with that of the backpropagation algorithm (BP) in feature recognition applications. Since MFT learning is bidirectional, its use can be extended from purely functional mappings to a content addressable memory. The storage capacity of such a network grows like O(10–20)n_H with the number of hidden units n_H. The MFT learning algorithm is local, and thus has an advantage over BP with respect to VLSI implementations. It is also demonstrated how MFT and BP are related in situations where the number of input units is much larger than the number of output units. In the context of finding good solutions to difficult optimization problems, the MFT technique again turns out to be extremely powerful. The quality of the solutions for large travelling salesman and graph partition problems is on a par with that obtained by optimally tuned simulated annealing methods. The algorithm employed here is based on multistate K-valued (K > 2) neurons rather than binary (K = 2) neurons. This method is also advantageous for more nested decision problems like scheduling. The MFT equations are isomorphic to resistance-capacitance equations and hence map naturally onto custom-made hardware. Given the diversity of successful application areas, the MFT approach thus constitutes a convenient platform for hardware development.}},
  author       = {{Peterson, Carsten}},
  issn         = {{0954-0091}},
  language     = {{eng}},
  number       = {{1}},
  pages        = {{3--33}},
  publisher    = {{Taylor and Francis A.S.}},
  series       = {{Connection Science}},
  title        = {{Mean field theory neural networks for feature recognition, content addressable memory and optimization}},
  url          = {{http://dx.doi.org/10.1080/09540099108946571}},
  doi          = {{10.1080/09540099108946571}},
  volume       = {{3}},
  year         = {{1991}},
}