On the statistical mechanics of (un)constrained stochastic Hopfield and 'elastic' neural networks
Stochastic binary Hopfield models are viewed from the angle of statistical mechanics. After an analysis of the unconstrained model using mean field theory, a similar investigation is applied to a constrained model, yielding comparably general explicit formulas for the free energy. Conditions are given under which some of the free energy expressions are Lyapunov functions of the corresponding differential equations. Both stochastic models appear to coincide with a specific continuous model. Physically, the models are related to spin and Potts glass models. In addition, a `complementary' free energy function is derived for both the unconstrained and the constrained model. The analysis culminates in a very general framework for analyzing constrained and unconstrained Hopfield neural networks: the stationary points of the corresponding free energy appear to coincide exactly with the set of equilibrium conditions of the corresponding continuous Hopfield neural network. Moreover, the relationship with `elastic net' algorithms is analyzed: it is proved that this class of algorithms cannot be derived from the theory of statistical mechanics (as is sometimes supposed), but should instead be considered a special `penalty method', namely one with dynamical penalty weights. We mention some experimental results and discuss the implications for the use of the various models in solving constrained optimization problems.
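To make the Lyapunov-function claim concrete, the following is a minimal sketch of the standard continuous Hopfield energy and its descent property; this is the classical textbook form, not necessarily the exact free energy expressions derived in the paper:

\begin{align}
E(V) &= -\tfrac{1}{2}\sum_{i,j} w_{ij} V_i V_j \;-\; \sum_i I_i V_i \;+\; \sum_i \int_0^{V_i} g^{-1}(v)\,dv,\\
\frac{dU_i}{dt} &= -U_i + \sum_j w_{ij} V_j + I_i, \qquad V_i = g(U_i),
\end{align}

where $g$ is a monotonically increasing transfer function (e.g.\ a sigmoid). For symmetric weights $w_{ij} = w_{ji}$, one obtains $\frac{dE}{dt} = -\sum_i \frac{dV_i}{dt}\,\frac{dU_i}{dt} \le 0$, so $E$ decreases along trajectories and its stationary points coincide with the network's equilibrium conditions $U_i = \sum_j w_{ij} V_j + I_i$.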