In this paper, we discuss the effectiveness of two kinds of free energy and four relaxation techniques in annealed neural networks for solving optimization problems. The two kinds of free energy characterize their respective dynamics: mean field annealing (MFA) and the TAP (Thouless, Anderson and Palmer) equations. Depending on how temperature is adapted or interpreted in annealed neural networks, the dynamics can be relaxed by four corresponding techniques: 1. The gradient descent method searches for a minimum at a fixed low temperature. 2. Deterministic annealing evaluates the mean configuration of the network at each temperature as the temperature is reduced gradually from a high value to a sufficiently low one (sketched below). 3. Parallel tempering runs different configurations at different temperatures and swaps replicas at thermal equilibrium with an appropriate acceptance probability (see the second sketch below). 4. Individual modulation assigns an independent local temperature to each processing element. We use graph bisection problems as benchmarks to evaluate the performance of the different approaches and draw conclusions.
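To make deterministic annealing of the mean field dynamics concrete on the graph bisection benchmark, the following is a minimal Python sketch. The Hamiltonian (cut size plus a quadratic balance penalty), the constraint weight alpha, and the geometric cooling schedule are illustrative assumptions, not necessarily the exact formulation used in our experiments.

```python
import numpy as np

def mfa_bisection(W, alpha=1.0, T0=10.0, T_min=0.05, cool=0.9,
                  sweeps=50, rng=None):
    """Deterministic annealing of the mean field (MFA) equations for
    graph bisection. W is a symmetric 0/1 adjacency matrix; alpha
    weights the balance constraint (hypothetical choices)."""
    rng = np.random.default_rng() if rng is None else rng
    n = W.shape[0]
    v = rng.uniform(-0.1, 0.1, n)      # mean spins v_i in (-1, 1)
    T = T0
    while T > T_min:
        for _ in range(sweeps):
            # Local field: graph coupling minus the balance-penalty term.
            h = W @ v - alpha * (v.sum() - v)
            v = np.tanh(h / T)         # mean field update at temperature T
        T *= cool                      # geometric cooling schedule
    s = np.sign(v)                     # read out the final partition
    cut = ((1 - np.outer(s, s)) * W).sum() / 4   # edges crossing the cut
    return s, cut

# Example: a small random graph.
rng = np.random.default_rng(0)
n = 40
W = np.triu((rng.random((n, n)) < 0.1).astype(float), 1)
W = W + W.T
s, cut = mfa_bisection(W, rng=rng)
print(f"cut size = {cut:.0f}, imbalance = {int(s.sum())}")
```

The replica-swap step of parallel tempering can likewise be spelled out. In the standard Metropolis exchange criterion, a swap between replicas at temperatures T_i and T_j with energies E_i and E_j is accepted with probability min(1, exp[(1/T_i - 1/T_j)(E_i - E_j)]); the sketch below shows this standard form, not necessarily the exact protocol used here.

```python
import math
import random

def swap_accept(E_i, E_j, T_i, T_j):
    """Metropolis acceptance test for exchanging two replicas."""
    delta = (1.0 / T_i - 1.0 / T_j) * (E_i - E_j)
    return delta >= 0 or random.random() < math.exp(delta)
```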