High-Dimensional Nonlinear Feature Selection with Hilbert-Schmidt Independence Criterion Lasso

Bibliographic Information

Other Title
  • High-Dimensional Nonlinear Feature Selection Based on the Hilbert-Schmidt Independence Criterion Lasso Method (Japanese title)

Abstract

Variable selection is a significant research topic in the statistics, machine learning, and data mining communities. In statistics, methods based on sparse modeling and sure independence screening (SIS) are major lines of research for feature selection problems. However, most of the feature selection methods developed in the machine learning community lack theoretical guarantees. Hence, these methods have been overlooked by the statistics community, despite the good prediction accuracy they usually achieve in real and simulated experiments. In this paper, we introduce the Hilbert-Schmidt Independence Criterion Lasso (HSIC Lasso), a feature selection method widely used in the machine learning and data mining communities. First, we introduce the HSIC Lasso as a feature selection method and derive the related convex optimization problem. Then, we describe the Block HSIC Lasso procedure together with the related selective inference framework. Furthermore, we show that the HSIC Lasso is closely related to the nonnegative Lasso and to HSIC-based SIS. Finally, we provide some large-sample properties of the HSIC Lasso.
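
For reference, the HSIC Lasso optimization problem mentioned in the abstract is commonly stated in the literature as a penalized Frobenius-norm fit over nonnegative weights; the exact form derived in the paper may differ in normalization details:

```latex
\min_{\alpha \in \mathbb{R}^d,\ \alpha \ge 0}\;
\frac{1}{2}\Big\| \bar{L} - \sum_{k=1}^{d} \alpha_k \bar{K}^{(k)} \Big\|_F^2
+ \lambda \|\alpha\|_1,
\qquad
\bar{K}^{(k)} = \Gamma K^{(k)} \Gamma,\quad
\bar{L} = \Gamma L \Gamma,\quad
\Gamma = I_n - \tfrac{1}{n}\mathbf{1}_n\mathbf{1}_n^\top,
```

where K^{(k)} is the Gram matrix of the k-th feature, L is the Gram matrix of the output, and \|\cdot\|_F is the Frobenius norm. Expanding the square shows that the linear term collects empirical HSIC values between each feature and the output, so nonzero \alpha_k mark features with strong (possibly nonlinear) dependence on the output. Because vectorizing the centered Gram matrices turns the Frobenius-norm fit into an ordinary least-squares term, the problem reduces to a nonnegative Lasso, which is the connection the abstract refers to. Below is a minimal sketch of that reduction; the Gaussian kernels, bandwidths, normalization, and regularization value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso


def gaussian_gram(v, sigma):
    """Gram matrix of a 1-D variable under a Gaussian kernel."""
    d = v[:, None] - v[None, :]
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))


def center(K):
    """Double-center a Gram matrix: Gamma K Gamma with Gamma = I - 11^T / n."""
    n = K.shape[0]
    G = np.eye(n) - np.ones((n, n)) / n
    return G @ K @ G


def hsic_lasso(X, y, lam=0.01):
    """Solve the HSIC Lasso as a nonnegative Lasso on vectorized Gram matrices.

    Returns the nonnegative coefficient vector alpha; nonzero entries
    correspond to selected features.
    """
    n, p = X.shape
    # Centered, Frobenius-normalized output Gram matrix (vector of responses).
    Lbar = center(gaussian_gram(y, np.std(y) + 1e-12))
    Lbar /= np.linalg.norm(Lbar)
    # One design column per feature: the vectorized centered Gram matrix.
    A = np.empty((n * n, p))
    for k in range(p):
        Kbar = center(gaussian_gram(X[:, k], np.std(X[:, k]) + 1e-12))
        A[:, k] = (Kbar / np.linalg.norm(Kbar)).ravel()
    # Nonnegative Lasso on the vectorized problem (alpha >= 0 via positive=True).
    model = Lasso(alpha=lam, positive=True, fit_intercept=False, max_iter=50000)
    model.fit(A, Lbar.ravel())
    return model.coef_


# Toy usage: y depends nonlinearly on the first two of twenty features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=100)
print(np.nonzero(hsic_lasso(X, y, lam=0.005))[0])
```

The sketch selects features by inspecting which entries of the returned coefficient vector are nonzero; the Block HSIC Lasso and the selective inference procedure described in the paper are not reproduced here.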
