Estimation of Nonlinear Regression Models Based on an Adaptive Model Selection Criterion

Bibliographic Information

Alternate Titles
  • Non-parametric Regression Model Estimation Based on an Adaptive Model Selection Criterion
  • テキオウガタ モデル センタク キジュン ニ モトヅク ヒセンケイ カイキ モデル ノ スイテイ

Abstract

In modeling procedures for non-parametric regression, model selection via optimization of the model tuning parameters is an important problem. This article discusses optimal model selection with a data-adaptive penalty, and its optimization using an evolutionary algorithm (EA). Most model selection criteria, such as AIC, BIC, and Mallows' Cp, use a fixed penalty to control model complexity. These selection procedures are non-adaptive, and their performance depends on the complexity the model actually requires: criteria with a "large" penalty perform well only for "simple" models, and vice versa. To avoid this kind of selection bias, we adopt the adaptive model selection criterion (AMSC), whose data-adaptive penalty is defined as the best estimator of the relative squared loss between the true model and the estimator, and which is applicable to models of varying structural complexity. In general, optimization for model selection is a complex non-linear problem. Most conventional methodologies use deterministic procedures, such as grid search or stepwise techniques, and obtain only sub-optimal solutions. In this paper, we consider global optimization of the AMSC via stochastic optimization procedures such as the EA. The proposed procedure is applied to 1) hierarchical neural networks, namely i) a single-layer feed-forward neural network and ii) a radial basis function network, and 2) a support vector machine, and its effectiveness is confirmed via simulation studies.
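To make the abstract's setup concrete, the following is a minimal sketch (not the paper's AMSC) of model selection by criterion minimization with an evolutionary search. The penalty here is the fixed AIC penalty of 2 per parameter rather than the paper's data-adaptive one, the "tuning parameter" is just a polynomial degree, and the toy (mu+lambda)-style EA replaces grid search; the data, population size, and mutation scheme are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy sine curve, standing in for an unknown true model.
n = 100
x = np.linspace(0.0, 1.0, n)
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.3, n)

def fit_poly_rss(degree):
    """Least-squares polynomial fit; returns the residual sum of squares."""
    X = np.vander(x, degree + 1)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta) ** 2))

def aic(degree):
    """Gaussian AIC: n*log(RSS/n) plus a FIXED penalty of 2 per parameter."""
    k = degree + 1
    return n * np.log(fit_poly_rss(degree) / n) + 2.0 * k

def ea_search(criterion, lo=1, hi=15, pop_size=6, generations=20):
    """Toy (mu+lambda) evolutionary search over an integer tuning parameter."""
    pop = rng.integers(lo, hi + 1, size=pop_size)
    for _ in range(generations):
        # Mutate each parent by a small random step, then select the best.
        children = np.clip(pop + rng.integers(-2, 3, size=pop_size), lo, hi)
        union = np.concatenate([pop, children])
        union = union[np.argsort([criterion(int(d)) for d in union])]
        pop = union[:pop_size]
    return int(pop[0])

best = ea_search(aic)
print("degree selected by AIC via EA:", best)
```

In the paper's setting, `aic` would be replaced by the AMSC (whose penalty is itself estimated from the data) and the search space by the tuning parameters of a neural network, RBF network, or support vector machine; the stochastic search itself carries over unchanged.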

Published In

Cited By (1)

References (21)

Related Projects
