Non-parametric Regression Model Estimation Based on an Adaptive Model Selection Criterion

Bibliographic Information

Other Title
  • 適応型モデル選択基準に基づく非線形回帰モデルの推定 (Estimation of Non-linear Regression Models Based on an Adaptive Model Selection Criterion)


Abstract

In modeling procedures for non-parametric regression, model selection via optimization of the model tuning parameters is an important problem. This article discusses optimal model selection with a data-adaptive penalty and its optimization using an evolutionary algorithm (EA). Most model selection criteria, such as AIC, BIC and Mallows' Cp, use a fixed penalty to control the complexity of the models. These model selection procedures are non-adaptive, and their performance depends on the complexity the model actually requires: for instance, a criterion with a "large" penalty performs well only for "simple" models, and vice versa. To avoid selection bias of this kind, we adopt the adaptive model selection criterion (AMSC), whose data-adaptive penalty is defined as the best estimator of the relative squared loss between the true model and the estimator and is applicable to various degrees of model complexity. In general, optimization for model selection is a complex non-linear problem. Most conventional methodologies adopt deterministic procedures, such as grid search or stepwise techniques, and obtain only sub-optimal solutions. In this paper, we consider global optimization of the AMSC via stochastic optimization procedures such as the EA. The proposed procedure is applied to 1) hierarchical neural networks, namely i) a single-layer feed-forward neural network and ii) a radial basis function network, and 2) a support vector machine, and its effectiveness is confirmed via simulation studies.
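The pipeline the abstract describes, minimizing a complexity-penalized selection criterion with a stochastic search, can be sketched as follows. This is an illustrative stand-in, not the authors' AMSC: it uses the corrected AIC (AICc) for Nadaraya-Watson kernel regression as the criterion and a (1+1)-evolution strategy as the EA, and the synthetic data, kernel, and mutation scale are all assumptions.

```python
import math
import random

random.seed(0)

# Synthetic data: a noisy sine curve (an assumed stand-in for the
# article's simulation settings, which are not given in the abstract).
n = 80
xs = [i / (n - 1) for i in range(n)]
ys = [math.sin(2 * math.pi * x) + random.gauss(0, 0.2) for x in xs]

def criterion(h):
    """Corrected AIC (AICc) for Nadaraya-Watson kernel regression with
    bandwidth h: log(RSS/n) + 1 + 2*(tr(L)+1)/(n - tr(L) - 2),
    where tr(L) is the trace of the smoother matrix (effective d.o.f.)."""
    rss, tr_l = 0.0, 0.0
    for i, xi in enumerate(xs):
        w = [math.exp(-0.5 * ((xi - xj) / h) ** 2) for xj in xs]
        s = sum(w)
        yhat = sum(wj * yj for wj, yj in zip(w, ys)) / s
        rss += (ys[i] - yhat) ** 2
        tr_l += w[i] / s          # i-th diagonal entry of the smoother matrix
    if rss <= 0.0 or tr_l >= n - 2:
        return float("inf")      # guard against degenerate (interpolating) fits
    return math.log(rss / n) + 1.0 + 2.0 * (tr_l + 1.0) / (n - tr_l - 2.0)

# (1+1)-evolution strategy on the log-bandwidth: mutate with Gaussian
# noise and keep whichever of parent and offspring scores better.
log_h = math.log(0.5)                    # deliberately oversmoothed start
best = criterion(math.exp(log_h))
for _ in range(200):
    cand = log_h + random.gauss(0, 0.3)  # mutation scale is an assumption
    score = criterion(math.exp(cand))
    if score < best:
        log_h, best = cand, score

print("selected bandwidth h =", math.exp(log_h), "criterion =", best)
```

AICc plays the role of a fixed-penalty criterion here; the article's point is that the AMSC would replace it with a data-adaptive penalty, while the stochastic search over the tuning parameter would stay the same.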

Journal

  • Ouyou toukeigaku, 33 (1), 71-91, 2004

    Japanese Society of Applied Statistics

Citations (1)

References (21)

