Projection learning of the minimum variance type
Description
Proposes a new supervised learning method named minimum variance projection learning (MVPL). Because of noise in the training examples, the learned functions are in general not uniquely determined; they are distributed around the function that would be obtained from noiseless training examples. The smaller the variance of this distribution, the more stable the learning results. MVPL is the learning method that, within a family of projection learnings, minimizes the variance of this distribution. The properties of MVPL are clarified and its effectiveness is illustrated by computer simulation.
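The abstract does not give the MVPL construction itself, so the sketch below only illustrates the quantity it refers to: for a hypothetical linear-in-parameters learner (Gaussian basis functions with regularized least squares, both assumptions for illustration, not the paper's projection-learning family), noisy training outputs produce learned functions scattered around the function fit from noiseless outputs, and a Monte Carlo simulation estimates the variance of that scatter for several members of the family.

```python
# Illustrative sketch only: the MVPL formulation is not given in this abstract, so a
# generic regularized linear model (a hypothetical choice) is used to show how the
# "variance of the distribution of learned functions" under output noise can be
# estimated by simulation.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Hypothetical noiseless target function (not from the paper).
    return np.sin(2.0 * np.pi * x)

# Training inputs and their noiseless outputs.
x_train = np.linspace(0.0, 1.0, 20)
y_clean = target(x_train)

# Linear-in-parameters model: Gaussian basis functions (an assumed model class).
centers = np.linspace(0.0, 1.0, 10)
def design(x, width=0.1):
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

X = design(x_train)
x_test = np.linspace(0.0, 1.0, 200)
X_test = design(x_test)

def fit_predict(y, lam):
    # Regularized least squares: one member of a family of linear learning operators.
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    return X_test @ w

noise_std = 0.1
n_trials = 500

for lam in (1e-6, 1e-3, 1e-1):
    # Function learned from the noiseless training examples.
    f_clean = fit_predict(y_clean, lam)
    # Monte Carlo estimate of how widely the learned functions are distributed
    # around f_clean when the training outputs are corrupted by noise.
    deviations = []
    for _ in range(n_trials):
        y_noisy = y_clean + noise_std * rng.standard_normal(y_clean.shape)
        f_noisy = fit_predict(y_noisy, lam)
        deviations.append(np.mean((f_noisy - f_clean) ** 2))
    print(f"lambda={lam:g}  mean squared deviation = {np.mean(deviations):.5f}")
```

Running the script prints one spread estimate per regularization value; a smaller value corresponds to the more stable behaviour the abstract describes, which is the criterion MVPL minimizes within its family of projection learnings.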
Published in
- ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378), vol. 3, pp. 1172-1177, IEEE, 2003-01-22