Extended Stochastic Complexity and Information-Theoretic Learning Theory

Bibliographic Information

Alternative Titles
  • 拡張型確率的コンプレキシティと情報論的学習理論 (original Japanese title)
  • カクチョウガタ カクリツテキ コンプレキシティ ト ジョウホウロンテキ ガクシ (katakana reading)

Abstract

Rissanen introduced stochastic complexity to define the amount of information in a given data sequence relative to a given hypothesis class of probability densities, where information is measured in terms of the logarithmic loss associated with universal data compression. This paper proposes the notion of extended stochastic complexity (ESC), generalizing Rissanen's stochastic complexity to the decision-theoretic setting in which a general real-valued function is used as a hypothesis and a general loss function is used as a distortion measure. We thereby demonstrate the effectiveness of ESC in the design and analysis of learning algorithms for sequential prediction and function estimation. As an application of ESC to sequential prediction, this paper shows that a sequential realization of ESC produces a sequential prediction algorithm called the aggregating strategy, for which the worst-case cumulative prediction loss is asymptotically minimal. As an application of ESC to function estimation, this paper shows that a batch approximation of ESC induces a batch-learning algorithm called the minimum L-complexity algorithm (MLC), for which the upper bound on the statistical risk is the smallest known to date. Through ESC we give a unifying view of the design of effective learning algorithms for fundamental learning problems.
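
For reference, the quantities named in the abstract can be sketched as follows, using generic symbols (a prior π over the hypothesis class, a loss function L, and a learning rate λ) that follow common usage rather than the paper's exact notation. Rissanen's stochastic complexity of a sequence x^n = x_1 … x_n relative to a class of densities {p_θ}, in its mixture form, is

    SC(x^n) = -\log \int p_\theta(x^n)\, \pi(\mathrm{d}\theta),

and extended stochastic complexity replaces the logarithmic loss with a general loss L evaluated on real-valued hypotheses f drawn from a class \mathcal{F}:

    I_\lambda(y^n \mid x^n) = -\frac{1}{\lambda} \log \int_{\mathcal{F}} \exp\!\Big(-\lambda \sum_{t=1}^{n} L\big(y_t, f(x_t)\big)\Big)\, \pi(\mathrm{d}f).

With the logarithmic loss and λ = 1, the second expression reduces to the first.

The sequential realization mentioned in the abstract corresponds, in spirit, to exponential weighting over the hypothesis class. The following Python sketch assumes a finite set of expert predictors, a uniform prior, squared loss, and a weighted-mean prediction rule; the function name and parameters are illustrative and not the paper's exact construction.

    import numpy as np

    def exp_weights_predict(experts, xs, ys, lam=0.5,
                            loss=lambda y, p: (y - p) ** 2):
        """Sequential prediction by exponential weighting over a finite expert class.

        experts : list of callables f(x) -> real-valued prediction
        xs, ys  : input sequence and outcome sequence of equal length
        lam     : learning rate (lambda > 0)
        loss    : per-round loss L(outcome, prediction)
        Returns the learner's predictions and its cumulative loss.
        """
        cum_loss = np.zeros(len(experts))   # cumulative loss of each expert so far
        preds, total = [], 0.0

        for x, y in zip(xs, ys):
            # Weights proportional to prior * exp(-lambda * past loss);
            # a uniform prior is assumed, and losses are shifted for numerical stability.
            w = np.exp(-lam * (cum_loss - cum_loss.min()))
            w /= w.sum()

            # Predict with the weighted mean of the experts' current predictions.
            expert_preds = np.array([f(x) for f in experts])
            p = float(w @ expert_preds)
            preds.append(p)
            total += loss(y, p)

            # Charge each expert its own loss once the outcome is revealed.
            cum_loss += np.array([loss(y, q) for q in expert_preds])

        return preds, total

    # Illustrative use: constant experts tracking a noisy constant sequence.
    rng = np.random.default_rng(0)
    experts = [lambda x, c=c: c for c in np.linspace(0.0, 1.0, 11)]
    ys = 0.7 + 0.1 * rng.standard_normal(200)
    _, total = exp_weights_predict(experts, range(200), ys)

For suitable λ and bounded losses, the cumulative loss of such a predictor stays within an additive term, logarithmic in the number of experts, of the loss of the best expert in hindsight.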

Published In

  • 応用数理

    Vol. 8, No. 3, pp. 188-203, 1998

    The Japan Society for Industrial and Applied Mathematics
