Feature Selection via l1-Penalized Squared-Loss Mutual Information
- Wittawat Jitkrittum (Tokyo Institute of Technology)
- Hirotaka Hachiya (Tokyo Institute of Technology)
- Masashi Sugiyama (Tokyo Institute of Technology)
Description
Feature selection is a technique to screen out less important features. Many existing supervised feature selection algorithms use redundancy and relevancy as the main criteria for selecting features. However, feature interaction, potentially a key characteristic in real-world problems, has not received much attention. As an attempt to take feature interaction into account, we propose l1-LSMI, an l1-regularization-based algorithm that maximizes a squared-loss variant of mutual information between selected features and outputs. Numerical results show that l1-LSMI performs well in handling redundancy, detecting non-linear dependency, and considering feature interaction.
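The dependence measure underlying the method, squared-loss mutual information (SMI), can be estimated by least-squares density-ratio fitting (LSMI). The following NumPy sketch illustrates that estimator only: it uses all samples as Gaussian kernel centres and a fixed kernel width and ridge parameter, which are illustrative assumptions rather than the paper's cross-validated procedure, and it omits the l1-penalized feature-weight optimization that the paper adds on top.

```python
import numpy as np

def gauss_kernel(A, B, sigma):
    """Pairwise Gaussian kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lsmi(x, y, sigma=1.0, lam=1e-3):
    """Least-squares estimate of squared-loss mutual information.

    A minimal sketch: all n samples serve as kernel centres, and
    sigma/lam are fixed (the paper tunes them by cross-validation).
    Returns approximately 0 when x and y are independent, and a
    larger value the stronger the dependency.
    """
    n = x.shape[0]
    Kx = gauss_kernel(x, x, sigma)            # (n, n) kernels on x
    Ky = gauss_kernel(y, y, sigma)            # (n, n) kernels on y
    # h_l = (1/n) sum_i Kx[i, l] * Ky[i, l]   (joint-distribution term)
    h = (Kx * Ky).mean(axis=0)
    # H_{l,l'} = (1/n^2) sum_{i,j} Kx[i,l] Kx[i,l'] Ky[j,l] Ky[j,l']
    # (product-of-marginals term, written with two matrix products)
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n ** 2
    # Ridge-regularized least-squares fit of the density ratio
    alpha = np.linalg.solve(H + lam * np.eye(n), h)
    return 0.5 * h @ alpha - 0.5
```

In l1-LSMI this estimate is maximized with respect to non-negative feature weights under an l1 penalty, so that weights of uninformative features are driven to exactly zero.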
Journal

- IEICE Transactions on Information and Systems, E96.D (7), 1513-1524, 2013
- The Institute of Electronics, Information and Communication Engineers (IEICE)
Details
- CRID: 1390282679355165824
- NII Article ID: 130003370928
- ISSN: 17451361, 09168532
- Text Language Code: en
- Data Source Type: JaLC, Crossref, CiNii Articles
- Abstract License Flag: Not available