Projection learning of the minimum variance type

Description

This paper proposes a new supervised learning method, named minimum variance projection learning (MVPL). Because of noise in the training examples, the learned functions are in general not uniquely determined; they are distributed around the function that would be obtained from noiseless training examples. The smaller the variance of this distribution, the more stable the learning result. MVPL is the learning method that, within a family of projection learnings, minimizes the variance of this distribution. We clarify the properties of MVPL and illustrate its effectiveness by computer simulation.
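To make the variance-of-learned-functions idea concrete, the following minimal Python sketch (not the paper's MVPL algorithm; the feature map, the two linear learning operators, and all parameter values are illustrative assumptions) trains repeatedly on the same inputs with freshly drawn noise and measures how much the resulting functions fluctuate. MVPL corresponds to choosing, within the family of projection learnings, the operator for which this fluctuation is smallest.

```python
# Illustrative sketch only: with noisy targets, repeated training yields a
# distribution of learned functions, and different linear learning operators
# produce different amounts of spread. The two operators below (minimum-norm
# least squares and a hypothetical regularized alternative) are stand-ins,
# not the operators studied in the paper.
import numpy as np

rng = np.random.default_rng(0)

def features(x, degree=9):
    """Polynomial feature map (degree chosen arbitrarily for illustration)."""
    return np.vander(x, degree + 1, increasing=True)

def true_f(x):
    """Noiseless target function used to generate training examples."""
    return np.sin(2.0 * np.pi * x)

# Fixed training inputs; only the additive noise changes across trials.
x_train = np.linspace(0.0, 1.0, 8)
x_test = np.linspace(0.0, 1.0, 200)
Phi_train = features(x_train)
Phi_test = features(x_test)

def learn_minimum_norm(y):
    """Minimum-norm solution via the Moore-Penrose pseudoinverse."""
    return np.linalg.pinv(Phi_train) @ y

def learn_alternative(y, tau=1e-3):
    """A different (hypothetical) linear learning operator, included only to
    show that members of a family can differ in output variance."""
    G = Phi_train @ Phi_train.T
    return Phi_train.T @ np.linalg.solve(G + tau * np.eye(len(y)), y)

def mean_prediction_variance(learn, n_trials=500, noise_std=0.1):
    """Average pointwise variance of the learned function across noise draws."""
    preds = []
    for _ in range(n_trials):
        y = true_f(x_train) + noise_std * rng.standard_normal(x_train.shape)
        preds.append(Phi_test @ learn(y))
    return np.var(np.stack(preds), axis=0).mean()

print("mean prediction variance (minimum-norm):", mean_prediction_variance(learn_minimum_norm))
print("mean prediction variance (alternative):", mean_prediction_variance(learn_alternative))
```

The printed quantities are Monte Carlo estimates of the spread of the learned functions around their mean; the smaller value corresponds to the more stable operator, which is the criterion MVPL optimizes within its family.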
