The Neural Networks Realization of a Gradient Method with a Convolution Integral by Using Neurons with “Memory” (Inertial Neurons)

Bibliographic Information

Other Title
  • “メモリー”付きニューロン (慣性系ニューロン) を用いた畳み込み積分型勾配法のニューラルネットワーク実現

Description

First, a new class of trajectory-method models for solving optimization problems is considered. In this model, the velocity of the trajectory is given in convolution-integral form over all gradients of the objective function evaluated along the trajectory up to the current time. The method can therefore be called a gradient method with the optimizer's "memory" of past gradient information, and the model can be transformed into a second-order differential equation model whose trajectory can pass over local optima given a suitable initial velocity.

Next, to solve quadratic programming problems whose variables are constrained to the closed interval [0, 1], the gradient method with "memory" is realized by neural networks built as operational circuits of neurons, each of which contains two integral elements. The trajectory of the realized networks can escape local minima in cases where the Hopfield-type network, governed by a first-order differential equation model, becomes trapped in them.

Finally, numerical simulation results for simple test problems demonstrate the properties of the presented neural networks.
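
The abstract does not state the memory kernel of the convolution integral. As an illustration only, the reduction to a second-order equation can be sketched for the common choice of an exponential kernel with a time constant \tau (an assumption here, not necessarily the paper's formulation):

```latex
% Assumed exponential memory kernel; the paper's exact kernel is not reproduced here.
\[
  \dot{x}(t) \;=\; -\int_{0}^{t} \frac{1}{\tau}\, e^{-(t-s)/\tau}\,
                   \nabla f\bigl(x(s)\bigr)\, ds .
\]
% Differentiating once in t and substituting the integral back gives the
% second-order ("inertial") model:
\[
  \tau\, \ddot{x}(t) \;+\; \dot{x}(t) \;=\; -\nabla f\bigl(x(t)\bigr),
\]
% a damped heavy-ball equation whose inertia can carry the trajectory over
% shallow local minima when the initial velocity \dot{x}(0) is large enough.
```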
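The following minimal numerical sketch contrasts the two kinds of dynamics described in the abstract: a first-order (Hopfield-type) gradient flow and a second-order inertial flow. The test function, time constant tau, and initial velocity v0 are all hypothetical choices made here for illustration; they are not the paper's quadratic programming examples, and the operational-circuit realization with two integral elements per neuron is not reproduced.

```python
import numpy as np

# --- Hypothetical 1D test function (not one of the paper's examples) ---
# Double well with a shallow local minimum near x = -0.96 and a deeper
# global minimum near x = +1.04, separated by a barrier near x = 0.
def f(x):
    return (x**2 - 1.0)**2 - 0.3 * x

def grad_f(x):
    return 4.0 * x * (x**2 - 1.0) - 0.3

dt, steps = 0.005, 20000
x0 = -0.95           # start close to the shallow local minimum

# First-order (Hopfield-type) gradient flow:  dx/dt = -f'(x)
x = x0
for _ in range(steps):
    x -= dt * grad_f(x)
x_first_order = x

# Second-order ("memory"/inertial) dynamics, taken here in the form
#   tau * x'' + x' = -f'(x),
# which is what the convolution model reduces to for an exponential kernel
# (an assumption; the paper's kernel and parameters are not given here).
tau, v0 = 2.0, 1.5   # assumed memory constant and initial velocity
x, v = x0, v0
for _ in range(steps):
    a = -(v + grad_f(x)) / tau   # acceleration from tau*x'' = -(x' + f'(x))
    v += dt * a
    x += dt * v
x_inertial = x

# With these (assumed) parameters the inertial trajectory carries enough
# kinetic energy to pass over the barrier and settle in the deeper well,
# while the first-order flow stays in the shallow one.
print(f"first-order flow : x = {x_first_order:+.3f}, f = {f(x_first_order):+.3f}")
print(f"inertial flow    : x = {x_inertial:+.3f}, f = {f(x_inertial):+.3f}")
```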
