Training of Deep Generative Models Using Several Loss Functions and its Application to Constrained Black-Box Optimization


Other Title
  • 複数の損失関数を用いた深層生成モデルの訓練と制約付きブラックボックス最適化への適用

Abstract

In constrained black-box optimization, optimizing the objective function is extremely difficult when the feasible domain X consists of multiple discrete feasible regions, so that even obtaining a feasible solution is hard. This paper proposes a technique that transforms the search space S into a simple one with almost no constraints. Specifically, we construct a map from an input space Z to X, the decoder G: Z -> X, and use the input space Z of G as the search space to achieve this transformation. To map onto discrete regions, we build the decoder G as a concatenation of small neural network models (NNs) with shortcut connections, and we define a loss function for each NN. This prevents mode collapse, a well-known problem in deep generative models. In the experiments, we demonstrate the usefulness of the proposed technique on a test problem in which the volume ratio of X to S is less than 1e-7.
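The abstract describes the decoder G as a chain of small NN blocks, each with a shortcut connection, whose intermediate outputs can each receive their own loss. A minimal sketch of that structure (all shapes, block counts, and function names here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_block(dim):
    """One small NN block: a single tanh layer with a shortcut connection."""
    W = rng.standard_normal((dim, dim)) * 0.1
    b = np.zeros(dim)
    def block(h):
        # Shortcut (residual) connection: output is h + g_i(h).
        return h + np.tanh(h @ W + b)
    return block

def make_decoder(dim, n_blocks):
    """Decoder G: Z -> X built by concatenating small blocks, G = g_k o ... o g_1."""
    blocks = [make_block(dim) for _ in range(n_blocks)]
    def G(z):
        h = z
        intermediates = []  # kept so a separate loss can be defined per block
        for g in blocks:
            h = g(h)
            intermediates.append(h)
        return h, intermediates
    return G

# Map a latent point z in Z to a candidate solution x in X.
G = make_decoder(dim=4, n_blocks=3)
z = rng.standard_normal(4)
x, hs = G(z)
print(x.shape, len(hs))
```

With Z as the new search space, an off-the-shelf black-box optimizer can then search over z unconstrained and evaluate the objective at G(z); the per-block losses during training are what push the composed map to cover all discrete feasible regions rather than collapsing onto one.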
