-
- Ruey S. Tsay
- Graduate School of Business, University of Chicago
-
- Tomohiro Ando
- Graduate School of Business Administration, Keio University
Abstract
In this paper we use penalized maximum likelihood and information criteria to propose a new boosting algorithm for various statistical models, including linear regression, generalized linear, and multi-class classification models. In contrast to previous studies, where empirical goodness-of-fit measures were often used for model updating, information criteria, as predictive measures of a model, are employed to select a model in each iteration of the proposed algorithms. In addition, the proposed algorithms select the smoothing parameter in each iteration, whereas previous methods fixed the parameter for all iterations.

We show that penalized maximum likelihood L2 boosting is consistent for high-dimensional linear models under the conditions that (a) the true underlying regression function is sparse and (b) the number of predictor variables is allowed to grow exponentially. We then demonstrate the proposed boosting algorithms using both simulated and real data. Comparison with some existing methods shows that the proposed boosting algorithms work well.
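The central idea of the abstract, choosing the update in each boosting iteration by an information criterion rather than raw goodness of fit, can be illustrated with a minimal sketch. This is not the authors' algorithm: it is plain componentwise L2 boosting with a crude AIC-style criterion, where the degrees of freedom are proxied by the number of distinct predictors selected so far, an assumption made purely for illustration.

```python
import numpy as np

def l2_boost_aic(X, y, nu=0.1, max_iter=100):
    """Componentwise L2 boosting tracked by a simple AIC-style criterion.

    Hedged sketch: df is crudely proxied by the count of distinct
    predictors selected so far, not the exact boosting hat-matrix trace.
    """
    n, p = X.shape
    fit = np.full(n, y.mean())            # initialize with the mean
    selected = set()
    best_aic, best_fit = np.inf, fit.copy()
    for _ in range(max_iter):
        resid = y - fit
        # Componentwise least squares: the single predictor that best
        # fits the current residuals (assumes no all-zero columns).
        scores = []
        for j in range(p):
            xj = X[:, j]
            beta = xj @ resid / (xj @ xj)
            rss_j = (resid - beta * xj) @ (resid - beta * xj)
            scores.append((rss_j, j, beta))
        _, j, beta = min(scores)
        fit = fit + nu * beta * X[:, j]   # shrunken update
        selected.add(j)
        rss = float((y - fit) @ (y - fit))
        # AIC-style score: n*log(RSS/n) + 2*df, df = |selected| + intercept
        aic = n * np.log(rss / n) + 2 * (len(selected) + 1)
        if aic < best_aic:
            best_aic, best_fit = aic, fit.copy()
    return best_fit, best_aic
```

On sparse simulated data this tends to recover a fit close to the truth while the criterion discourages updates that add new predictors without a matching drop in residual sum of squares.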
Published in
-
- 応用統計学 (Applied Statistics), 38 (2), 41-67, 2009
- 応用統計学会 (Japanese Society of Applied Statistics)
Details
-
- CRID
- 1390282679417908096
-
- NII Article ID
- 10026048972
-
- NII Bibliographic ID
- AN00330942
-
- ISSN
- 18838081
- 02850370
-
- NDL Bibliographic ID
- 10414197
-
- Text Language Code
- en
-
- Data Source Type
-
- JaLC
- NDL
- Crossref
- CiNii Articles
-
- Abstract License Flag
- Use not permitted