A Random Search Optimization Method Using Derivative Information

Bibliographic Information

Alternative Titles
  • New Random Search Method for Global Optimization by Using Gradient Information
  • Likelihood Search Method (L.S.M.)

Abstract

In this paper, a new optimization method named the Likelihood Search Method (L.S.M.) is proposed for searching for a global optimum systematically and effectively within a single framework, rather than as a combination of different methods. The L.S.M. can be applied to differentiable objective functions. It realizes both intensification and diversification of the search based on the idea that the search is intensified where the likelihood of finding good solutions is high and diversified where the likelihood is low. The L.S.M. is basically a Random Search Method (R.S.M.), but it utilizes gradient information: the likelihood of finding good solutions is defined by the norm of the gradient. When the norm of the gradient is large, the likelihood is high; that is, a better solution is very likely to exist in the vicinity of the current solution. When the norm is small, the likelihood is low; that is, a better solution is not expected near the current solution. Therefore, in the case of high likelihood, the search is performed within a short range in the direction opposite to the gradient vector, so an intensified search is realized. In the case of low likelihood, the search depends little on the gradient: the range is wide, and the search direction may even include directions that worsen the objective function, so a diversified search is realized. Simulation results on minimizing a complicated multivariable nonlinear function and on controlling a nonlinear crane system show that the L.S.M. is superior to both the Gradient Method and the R.S.M.
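The abstract's search rule can be sketched as follows. This is a minimal illustration of the stated idea, not the paper's actual algorithm: the threshold `g_high`, the step radii `r_short` and `r_wide`, the small jitter on the intensified step, and the greedy acceptance rule are all assumptions filled in for the sketch.

```python
import numpy as np

def lsm_minimize(f, grad, x0, n_iter=2000, r_short=0.05, r_wide=1.0,
                 g_high=1.0, rng=None):
    """Sketch of a likelihood-guided random search (parameters assumed).

    Likelihood of a better nearby solution is measured by the gradient
    norm. Large norm -> intensify: short step opposite the gradient,
    with a little random jitter. Small norm -> diversify: wide random
    step in an arbitrary direction, even one that could worsen f.
    Only improving moves are accepted (an assumed acceptance rule).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(n_iter):
        g = grad(x)
        g_norm = np.linalg.norm(g)
        if g_norm > g_high:
            # High likelihood: short-range search opposite the gradient.
            step = -r_short * g / g_norm
            step += 0.1 * r_short * rng.standard_normal(x.size)
        else:
            # Low likelihood: wide-range search, direction unrestricted.
            step = r_wide * rng.uniform(-1.0, 1.0, x.size)
        x_new = x + step
        f_new = f(x_new)
        if f_new < fx:
            x, fx = x_new, f_new
    return x, fx
```

For example, minimizing a simple quadratic `f(x) = ||x||^2` from a distant start point first triggers the intensified (gradient-opposite) steps while the gradient norm is large, then switches to wide random steps near the flat bottom.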

Journal
