Generalized Hopfield Networks and Nonlinear Optimization

Part of Advances in Neural Information Processing Systems 2 (NIPS 1989)


Authors

Gintaras Reklaitis, Athanasios Tsirukis, Manoel Tenorio

Abstract

A nonlinear neural framework, called the Generalized Hopfield Network (GHN), is proposed, which is able to solve systems of nonlinear equations in a parallel, distributed manner. The method is applied to the general nonlinear optimization problem. We demonstrate GHNs implementing the three most important optimization algorithms, namely the Augmented Lagrangian, Generalized Reduced Gradient and Successive Quadratic Programming methods. The study results in a dynamic view of the optimization problem and offers a straightforward model for the parallelization of the optimization computations, thus significantly extending the practical limits of problems that can be formulated as an optimization problem and that can gain from the introduction of nonlinearities in their structure (e.g. pattern recognition, supervised learning, design of content-addressable memories).

1 To whom correspondence should be addressed.


1 RELATED WORK

The ability of networks of highly interconnected simple nonlinear analog processors (neurons) to solve complicated optimization problems was demonstrated in a series of papers by Hopfield and Tank (Hopfield, 1984), (Tank, 1986). The Hopfield computational model is almost exclusively applied to the solution of combinatorially complex linear decision problems (e.g. the Traveling Salesman Problem). Unfortunately such problems cannot be solved with guaranteed quality (Bruck, 1987); the network can become trapped in locally optimal solutions. Jeffrey and Rossner (Jeffrey, 1986) extended Hopfield's technique to the nonlinear unconstrained optimization problem, using Cauchy dynamics. Kennedy and Chua (Kennedy, 1988) presented an analog implementation of a network solving a nonlinear optimization problem. The underlying optimization algorithm is a simple transformation method (Reklaitis, 1983), which is known to be relatively inefficient for large nonlinear optimization problems.
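To make the comparison concrete, the transformation methods mentioned above fold a constraint into the objective via a penalty term and then follow the gradient of the penalized function. The following is a minimal sketch, not the authors' method: it assumes a quadratic penalty, a single equality constraint, and plain Euler gradient steps, with all function names and parameter values chosen for illustration.

```python
import numpy as np

def penalty_gradient_descent(f_grad, h, h_grad, x0, mu=10.0, lr=0.01, steps=5000):
    """Descend P(x) = f(x) + mu * h(x)^2 by Euler gradient steps.

    This is the simple transformation (penalty) approach: the equality
    constraint h(x) = 0 is absorbed into the objective, at the cost of a
    bias that shrinks only as mu grows -- one reason such methods scale
    poorly to large nonlinear problems.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = f_grad(x) + 2.0 * mu * h(x) * h_grad(x)  # grad of penalized objective
        x -= lr * g
    return x

# Illustrative problem: minimize (x-2)^2 + (y-1)^2 subject to x + y = 2.
# The exact constrained minimizer is (1.5, 0.5); the penalty solution
# with finite mu lands nearby but slightly off the constraint.
f_grad = lambda x: 2.0 * (x - np.array([2.0, 1.0]))
h = lambda x: x[0] + x[1] - 2.0
h_grad = lambda x: np.array([1.0, 1.0])

x_star = penalty_gradient_descent(f_grad, h, h_grad, x0=[0.0, 0.0])
```

Note how the result is close to, but not exactly on, the constraint surface; driving the residual to zero requires mu to grow, which makes the penalized landscape increasingly ill-conditioned.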

2 LINEAR HOPFIELD NETWORK (LHN)

The computation in a Hopfield network is done by a collection of highly interconnected simple neurons. Each processing element, i, is characterized by its activation level, U_i, which is a function of the input received from the external environment, I_i, and the state of the other neurons. The activation level of neuron i is transmitted to the other processors after passing through a filter that converts U_i to a 0-1 binary value, V_i. The time behavior of the system is described by the following model:

$$\frac{dU_i}{dt} = \sum_j T_{ij} V_j - \frac{U_i}{\tau} + I_i$$
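The dynamics above can be simulated directly by Euler integration. The sketch below is illustrative only: it assumes a smooth sigmoid in place of the hard 0-1 filter, a time constant of 1, and an arbitrary gain; the two-neuron weights and inputs are invented for the example.

```python
import numpy as np

def simulate_hopfield(T, I, tau=1.0, gain=5.0, dt=0.01, steps=2000):
    """Euler-integrate dU_i/dt = sum_j T_ij V_j - U_i/tau + I_i.

    V_i is obtained from U_i through a sigmoid filter that squashes the
    activation into (0, 1), a smooth stand-in for the binary filter.
    """
    U = np.zeros(len(I))
    for _ in range(steps):
        V = 1.0 / (1.0 + np.exp(-gain * U))  # 0-1 squashing filter
        U += dt * (T @ V - U / tau + I)      # Euler step of the dynamics
    return U, V

# Two mutually inhibiting neurons (assumed example): the neuron with the
# larger external input suppresses the other, a winner-take-all outcome.
T = np.array([[0.0, -2.0],
              [-2.0, 0.0]])
I = np.array([1.0, 0.5])
U, V = simulate_hopfield(T, I)
```

Running this, the first neuron's output settles near 1 and the second's near 0, illustrating how the continuous dynamics relax to a stable decision.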