Variable Selection in Multivariate Linear Regression Models with Fewer Observations than the Dimension

Abstract

This paper deals with variable selection in multivariate linear regression models with fewer observations than the dimension, using Akaike's information criterion (AIC). It is well known that the AIC cannot be defined when the dimension of an observation is larger than the sample size, since the ordinary estimator of the covariance matrix becomes singular. By replacing the ordinary estimator of the covariance matrix with a ridge-type estimator, we propose a new AIC for selecting variables in multivariate linear regression models even when the dimension of an observation exceeds the sample size. The bias-correction term of the AIC is evaluated under an asymptotic framework in which the dimension and the sample size approach ∞ simultaneously. Numerical studies verify that the proposed criteria perform well.
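To illustrate the idea described in the abstract (not the authors' exact formulation), the following is a minimal sketch: when the dimension p exceeds the sample size n, the residual covariance matrix is singular, so a ridge-type estimator S + λI is substituted into the Gaussian log-likelihood, yielding an AIC-like score for each candidate subset of regressors. The regularization parameter `lam`, the absence of an intercept, and the naive parameter-count penalty below are illustrative assumptions, not the bias-corrected criterion derived in the paper.

```python
import numpy as np
from itertools import combinations

def aic_ridge(Y, X, cols, lam=0.1):
    """AIC-type score for the regressors in `cols`, using a ridge-type
    covariance estimator S + lam*I so the score is defined even when p > n.
    Illustrative only: lam and the penalty term are assumptions, not the
    paper's bias-corrected criterion."""
    n, p = Y.shape
    Xs = X[:, list(cols)]                 # candidate design matrix (n x k)
    k = Xs.shape[1]
    # Least-squares fit and residual covariance
    B, *_ = np.linalg.lstsq(Xs, Y, rcond=None)
    E = Y - Xs @ B
    S = E.T @ E / n                       # singular when p > n
    S_ridge = S + lam * np.eye(p)         # ridge-type estimator (invertible)
    # Gaussian -2 log-likelihood evaluated at the ridge-type estimator
    _, logdet = np.linalg.slogdet(S_ridge)
    minus2loglik = n * (logdet + np.trace(np.linalg.solve(S_ridge, S)))
    # Naive AIC penalty: k*p regression coefficients + p(p+1)/2 covariance terms
    return minus2loglik + 2 * (k * p + p * (p + 1) / 2)

# Usage: exhaustive search over all subsets of a small pool of regressors
rng = np.random.default_rng(0)
n, p, k_full = 20, 50, 4                  # p > n on purpose
X = rng.standard_normal((n, k_full))
Y = X[:, :2] @ rng.standard_normal((2, p)) + rng.standard_normal((n, p))
scores = {c: aic_ridge(Y, X, c)
          for r in range(1, k_full + 1)
          for c in combinations(range(k_full), r)}
best_subset = min(scores, key=scores.get)
```

Here the subset with the smallest score is selected; in this toy setup it should typically recover the first two columns, which generate the response.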

Journal

  • Ouyou toukeigaku 39 (1), 1-19, 2010

    Japanese Society of Applied Statistics

