Stochastic Discrete First-Order Algorithm for Feature Subset Selection
-
- KUDO Kota
- Graduate School of Science and Technology, University of Tsukuba
-
- TAKANO Yuichi
- Faculty of Engineering, Information and Systems, University of Tsukuba
-
- NOMURA Ryo
- Center for Data Science, Waseda University
Abstract
This paper addresses the problem of selecting a significant subset of candidate features to use for multiple linear regression. Bertsimas et al. [5] recently proposed the discrete first-order (DFO) algorithm to efficiently find near-optimal solutions to this problem. However, this algorithm is unable to escape from locally optimal solutions. To resolve this, we propose a stochastic discrete first-order (SDFO) algorithm for feature subset selection. In this algorithm, random perturbations are added to a sequence of candidate solutions as a means of escaping from locally optimal solutions, which broadens the range of discoverable solutions. Moreover, we derive the optimal step size in the gradient-descent direction to accelerate convergence of the algorithm. We also make effective use of the L2-regularization term to improve the predictive performance of the resultant subset regression model. The simulation results demonstrate that our algorithm substantially outperforms the original DFO algorithm. Our algorithm was also superior in predictive performance to the lasso and forward stepwise selection.
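The abstract describes an iteration of the DFO type (a gradient step followed by hard thresholding onto the k-sparse set), with random perturbations added to candidate solutions to escape local optima. The following is a minimal sketch of that idea, not the authors' exact method: it uses the standard 1/L step size (the paper derives an optimal step size instead), a Gaussian perturbation scheme and a least-squares refit on the selected support, all of which are assumptions for illustration. The function name `sdfo_subset_selection` and its parameters are hypothetical.

```python
import numpy as np

def sdfo_subset_selection(X, y, k, n_iter=500, noise_scale=0.1, seed=0):
    """Sketch of a stochastic discrete first-order method for best-subset
    selection: perturbed gradient steps with hard thresholding to k features.
    The perturbation scheme and step size here are illustrative assumptions,
    not the scheme derived in the paper."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Step size 1/L, with L the Lipschitz constant of the least-squares
    # gradient (largest eigenvalue of X^T X).
    L = np.linalg.eigvalsh(X.T @ X).max()
    beta = np.zeros(p)
    best_beta, best_obj = beta.copy(), 0.5 * np.sum(y ** 2)
    for _ in range(n_iter):
        grad = -X.T @ (y - X @ beta)               # gradient of 0.5*||y - X beta||^2
        z = beta - grad / L                        # gradient-descent step
        z += noise_scale * rng.standard_normal(p)  # random perturbation (assumed Gaussian)
        # Hard thresholding: keep the k largest-magnitude coordinates.
        support = np.argsort(np.abs(z))[-k:]
        beta = np.zeros(p)
        # Polishing step (assumption): refit least squares on the support.
        beta[support] = np.linalg.lstsq(X[:, support], y, rcond=None)[0]
        obj = 0.5 * np.sum((y - X @ beta) ** 2)
        if obj < best_obj:                         # keep the best solution found
            best_obj, best_beta = obj, beta.copy()
    return best_beta, best_obj
```

On synthetic data with a sparse ground truth, the returned coefficient vector has at most k nonzeros and a residual objective no worse than the all-zero model; the random restarts of the perturbed iterates are what let the search leave a locally optimal support.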
Journal
-
- IEICE Transactions on Information and Systems
-
IEICE Transactions on Information and Systems E103.D (7), 1693-1702, 2020-07-01
The Institute of Electronics, Information and Communication Engineers (IEICE)
Details
-
- CRID
- 1390285300171471616
-
- NII Article ID
- 130007867764
-
- ISSN
- 17451361
- 09168532
-
- Text language code
- en
-
- Data source type
-
- JaLC
- Crossref
- CiNii Articles
- KAKEN
-
- Abstract license flag
- Not available for reuse