Learning Bayesian Network Classifiers to Minimize the Number of Class Variable Parameters by Depth-First Branch and Bound Algorithm


Bibliographic Information

Other Title
  • 深さ優先分枝限定法による目的変数パラメータ数を最小化するベイジアンネットワーク分類器学習

Abstract

Bayesian network classifiers (BNCs) are known to provide accurate classification. Recent studies show that a BNC with the minimum number of class variable parameters (NCP) achieves higher classification accuracy than other BNCs. However, the conventional algorithm, which uses a breadth-first search, cannot learn BNCs with more than 20 variables. To address this problem, this study proposes a new learning algorithm based on a depth-first branch and bound search. Specifically, we (1) prove that the NCP of Naive Bayes is a lower bound on the NCP of BNCs and (2) propose a new heuristic function based on this lower bound. The benefits of the proposed method are as follows: (1) it improves on the runtime of existing methods that use a breadth-first search, and (2) it returns the best solution found so far when stopped early due to limited computational resources. Comparison experiments demonstrate that the proposed algorithm can learn a BNC with 58 variables, which conventional methods cannot.
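The search strategy described in the abstract can be illustrated with a generic depth-first branch-and-bound skeleton. The sketch below is not the authors' implementation: the node representation and the `expand`, `is_complete`, `lower_bound`, and `node_budget` names are hypothetical placeholders. It only shows the two properties the abstract emphasizes: pruning subtrees whose lower bound cannot beat the incumbent, and returning the best solution found so far if the search is stopped early.

```python
# Minimal sketch of depth-first branch and bound with a lower-bound heuristic.
# All problem-specific details (node encoding, expansion, bound) are assumed
# placeholders, not the paper's actual BNC learning procedure.
from dataclasses import dataclass
from typing import Callable, Iterable, Optional, Tuple


@dataclass(frozen=True)
class Node:
    """A partial solution in the search tree (hypothetical representation)."""
    choices: Tuple[int, ...] = ()   # decisions made so far
    cost: float = 0.0               # objective accumulated so far


def depth_first_branch_and_bound(
    root: Node,
    expand: Callable[[Node], Iterable[Node]],    # children of a partial solution
    is_complete: Callable[[Node], bool],         # is this a full solution?
    lower_bound: Callable[[Node], float],        # admissible bound on any completion
    node_budget: Optional[int] = None,           # optional early-stopping limit
) -> Tuple[Optional[Node], float]:
    """Return the best complete node found (the incumbent) and its cost."""
    incumbent: Optional[Node] = None
    incumbent_cost = float("inf")
    stack = [root]
    expanded = 0

    while stack:
        if node_budget is not None and expanded >= node_budget:
            break                                # early stop: incumbent is best so far
        node = stack.pop()
        expanded += 1

        if lower_bound(node) >= incumbent_cost:
            continue                             # prune: cannot beat the incumbent

        if is_complete(node):
            if node.cost < incumbent_cost:
                incumbent, incumbent_cost = node, node.cost
            continue

        # Depth-first: push children so the most promising (smallest bound) is popped first.
        stack.extend(sorted(expand(node), key=lower_bound, reverse=True))

    return incumbent, incumbent_cost


# Toy usage: choose 3 bits minimizing their sum; lower_bound = cost so far.
if __name__ == "__main__":
    expand = lambda n: (Node(n.choices + (b,), n.cost + b) for b in (0, 1))
    best, best_cost = depth_first_branch_and_bound(
        Node(), expand,
        is_complete=lambda n: len(n.choices) == 3,
        lower_bound=lambda n: n.cost,
    )
    print(best, best_cost)  # Node(choices=(0, 0, 0), cost=0.0) 0.0
```

Because the incumbent is updated whenever a better complete solution is found, interrupting the loop at any point (here via the hypothetical `node_budget`) still yields a valid, best-so-far answer, which mirrors the anytime behaviour claimed for the proposed algorithm.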



