Bayesian Optimization Based on Meta Learning with Neural Process

Other Title
  • Neural Process によるメタ学習にもとづくベイズ最適化

Abstract

Bayesian optimization is a technique for optimizing a black-box function using a probabilistic model with as few observations as possible. In this study, we consider Bayesian optimization in a situation where functions similar to the target function can be evaluated at low cost. We propose BONP, which uses neural processes (NPs), deep generative models that account for predictive uncertainty, as the surrogate model. Although NPs can be used for meta-learning, they often ignore the given observations, which causes underfitting. To avoid this issue, we also propose Dot-CNP, which maps observation points to a function space, and apply it to BONP. In experiments, we address a regression problem on 1D synthetic functions and Bayesian optimization with three types of acquisition functions, demonstrating the effectiveness of the proposed method.
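To illustrate the overall structure described in the abstract (a neural-process-style surrogate conditioned on the observations so far, combined with an acquisition function inside a Bayesian optimization loop), the following is a minimal sketch in PyTorch. It is not the paper's BONP or Dot-CNP implementation: the names `CNPSurrogate`, `ucb`, and `bayes_opt`, the network sizes, and the toy objective are all hypothetical, and the surrogate here is untrained rather than meta-trained on related functions.

```python
# A minimal sketch (not the paper's implementation) of Bayesian optimization
# with a CNP-style surrogate. All names and hyperparameters are hypothetical.
import torch
import torch.nn as nn


class CNPSurrogate(nn.Module):
    """Encodes (x, y) context points into an aggregated representation and
    predicts a Gaussian over y at target locations."""

    def __init__(self, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(),
                                     nn.Linear(hidden, hidden))
        self.decoder = nn.Sequential(nn.Linear(hidden + 1, hidden), nn.ReLU(),
                                     nn.Linear(hidden, 2))  # mean, raw scale

    def forward(self, x_ctx, y_ctx, x_tgt):
        # x_ctx: (N, 1), y_ctx: (N, 1), x_tgt: (M, 1)
        r = self.encoder(torch.cat([x_ctx, y_ctx], dim=-1)).mean(dim=0)
        r = r.expand(x_tgt.shape[0], -1)                  # broadcast to targets
        out = self.decoder(torch.cat([x_tgt, r], dim=-1))
        mean, raw_scale = out[:, :1], out[:, 1:]
        std = 0.01 + torch.nn.functional.softplus(raw_scale)  # floor the std
        return mean, std


def ucb(mean, std, beta=2.0):
    """Upper confidence bound acquisition (for maximization)."""
    return mean + beta * std


def bayes_opt(f, surrogate, bounds=(-2.0, 2.0), n_init=3, n_iter=10):
    """Condition the surrogate on observations so far, then query the
    maximizer of the acquisition function over a candidate grid."""
    xs = torch.linspace(*bounds, 200).unsqueeze(-1)       # candidate grid
    x_obs = torch.rand(n_init, 1) * (bounds[1] - bounds[0]) + bounds[0]
    y_obs = f(x_obs)
    for _ in range(n_iter):
        with torch.no_grad():
            mean, std = surrogate(x_obs, y_obs, xs)
        x_next = xs[ucb(mean, std).argmax()].unsqueeze(0)  # next query point
        x_obs = torch.cat([x_obs, x_next])
        y_obs = torch.cat([y_obs, f(x_next)])
    return x_obs, y_obs


if __name__ == "__main__":
    # In BONP the surrogate would be meta-trained on similar, cheap-to-evaluate
    # functions before being used in the optimization loop.
    surrogate = CNPSurrogate()
    f = lambda x: -(x - 0.5) ** 2                          # toy 1-D objective
    x_obs, y_obs = bayes_opt(f, surrogate)
    print("best x:", x_obs[y_obs.argmax()].item())
```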

Details

  • CRID
    1390003825189419136
  • NII Article ID
    130007856974
  • DOI
    10.11517/pjsai.jsai2020.0_2j1gs202
  • Text Lang
    ja
  • Data Source
    • JaLC
    • CiNii Articles
  • Abstract License Flag
    Disallowed
