Twofold Structure of Duality in Bayesian Model Averaging
- Ohnishi Toshio, Faculty of Economics, Kyushu University
- Yanagimoto Takemi, Department of Industrial and Systems Engineering, Chuo University
Abstract
Two Bayesian prediction problems in the context of model averaging are investigated by adopting dual Kullback-Leibler divergence losses, the e-divergence and the m-divergence losses. We show that the optimal predictors under the two losses satisfy interesting saddlepoint-type equalities. Specifically, the optimal predictor under the e-divergence loss balances the log-likelihood ratio and the loss, while the optimal predictor under the m-divergence loss balances the Shannon entropy difference and the loss. These equalities also hold for the predictors maximizing the log-likelihood and the Shannon entropy under the e-divergence loss and the m-divergence loss, respectively, showing that moderately enlarging the log-likelihood and the Shannon entropy leads to the optimal predictors. In each divergence loss case, by minimizing a certain convex function we derive a robust predictor in the sense that its posterior risk is constant. The Legendre transformation induced by this convex function implies that there is inherent duality in each Bayesian prediction problem.
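As a minimal numerical sketch of the dual losses discussed in the abstract, the snippet below computes the two orientations of the Kullback-Leibler divergence for a toy posterior-weighted mixture of two discrete model distributions. It assumes (as in information geometry) that the m- and e-divergence losses correspond to the two orientations of the KL divergence; the distributions and the weight are purely illustrative, not taken from the paper.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

# Toy model-averaging setting: two candidate model distributions and a
# posterior-weighted mixture predictor (all values are illustrative).
p1 = np.array([0.7, 0.2, 0.1])
p2 = np.array([0.3, 0.4, 0.3])
w = 0.6                           # hypothetical posterior model weight
mixture = w * p1 + (1 - w) * p2   # arithmetic (m-side) average of the models

# The dual losses for a predictor q against a target p are the two
# orientations of the KL divergence: D(p || q) and D(q || p).
m_loss = kl(p1, mixture)  # m-divergence orientation
e_loss = kl(mixture, p1)  # e-divergence orientation
print(m_loss, e_loss)
```

Both orientations are nonnegative and vanish only when the predictor equals the target, which is why they both serve as losses; they differ in how they penalize over- versus under-dispersion of the predictor.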
Journal
- JOURNAL OF THE JAPAN STATISTICAL SOCIETY 43 (1), 29-55, 2013
- THE JAPAN STATISTICAL SOCIETY
Details
- CRID: 1390001205286530688
- NII Article ID: 10031185799
- NII Book ID: AA1105098X
- ISSN: 13486365, 18822754
- MRID: 3154717
- NDL BIB ID: 024763350
- Text Lang: en
- Data Source: JaLC, NDL, Crossref, CiNii Articles, KAKEN
- Abstract License Flag: Disallowed