- Ivarsson Patrik (Department of Computer Science and Intelligent Systems, Osaka Prefecture University)
- Nojima Yusuke (Department of Computer Science and Intelligent Systems, Osaka Prefecture University)
- Ishibuchi Hisao (Department of Computer Science and Intelligent Systems, Osaka Prefecture University)
Abstract
GAssist is a high-performing machine learning algorithm that obtains a classifier through genetic learning. In this paper, we examine the effects of implementing GAssist as a parallel distributed model. In our parallel distributed model, the population of individuals is divided into multiple subpopulations. The training data are also divided into multiple subsets, and one subset is assigned to each subpopulation. In each subpopulation, a genetic algorithm is performed independently of the other subpopulations. Additionally, we periodically rotate the training data subsets among the subpopulations. In this way, we avoid over-fitting of each subpopulation to its local training data subset. Through computational experiments, we examine the effectiveness of our parallel distributed model with respect to its search ability, generalization ability, and computation time.
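The scheme described in the abstract can be sketched as follows. This is a minimal toy illustration, not the authors' GAssist implementation: all names and parameters (subpopulation count, rotation interval, the bit-string genome and one-max-style fitness) are hypothetical stand-ins chosen only to show the structure of the parallel distributed model, i.e. separate subpopulations evolving on separate training subsets, with the subsets rotated periodically among subpopulations.

```python
import random

# Hypothetical sketch of the parallel distributed model (not the authors' code):
# the population is split into subpopulations, the training data into subsets,
# each subpopulation runs its own GA on its assigned subset, and subsets are
# rotated among subpopulations every few generations to limit over-fitting
# to any single local subset.

def split(items, n):
    """Split a list into n roughly equal consecutive chunks."""
    k = len(items) // n
    return [items[i * k:(i + 1) * k] if i < n - 1 else items[i * k:]
            for i in range(n)]

def evolve(subpop, subset, fitness, rng):
    """One generation of a minimal GA: binary tournament + bit-flip mutation."""
    new = []
    for _ in range(len(subpop)):
        a, b = rng.sample(subpop, 2)
        parent = a if fitness(a, subset) >= fitness(b, subset) else b
        # per-bit mutation with probability 0.05 (arbitrary toy rate)
        new.append([g ^ (rng.random() < 0.05) for g in parent])
    return new

def parallel_distributed_ga(data, fitness, n_sub=4, pop_size=40,
                            generations=20, rotate_every=5, seed=0):
    rng = random.Random(seed)
    genome_len = 8
    population = [[rng.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    subpops = split(population, n_sub)   # one subpopulation per subset
    subsets = split(data, n_sub)         # one training subset per subpopulation
    for gen in range(generations):
        if gen > 0 and gen % rotate_every == 0:
            subsets = subsets[1:] + subsets[:1]  # rotate subsets one position
        subpops = [evolve(sp, ss, fitness, rng)
                   for sp, ss in zip(subpops, subsets)]
    # evaluate the merged final population on the full training data
    all_inds = [ind for sp in subpops for ind in sp]
    return max(all_inds, key=lambda ind: fitness(ind, data))

# Toy demo: the "training data" are all-ones target patterns, and fitness
# counts matching bits, so evolution should drive genomes toward all ones.
data = [[1] * 8 for _ in range(40)]

def fit(ind, subset):
    return sum(sum(1 for g, t in zip(ind, pat) if g == t) for pat in subset)

best = parallel_distributed_ga(data, fit)
```

Each subpopulation sees only a quarter of the data at any time, but after every rotation it is evaluated against a different subset, so an individual must generalize across subsets to keep surviving, which is the abstract's stated rationale for rotation.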
Journal
- Proceedings of the Fuzzy System Symposium 29 (0), 97-97, 2013
- Japan Society for Fuzzy Theory and Intelligent Informatics
Details
- CRID: 1390282680648346112
- NII Article ID: 130005480412, 40019824968
- NII Book ID: AA12165648
- ISSN: 18820212
- NDL BIB ID: 024926399
- Text Lang: ja
- Data Source: JaLC, NDL, CiNii Articles
- Abstract License Flag: Disallowed