Bibliographic Information
- Other title
  - Federated Learning Enhanced by Continual Learning for Common and Uncommon Features
Abstract
Federated learning is a promising machine learning technique that enables multiple clients to collaboratively build a model without revealing their raw data to each other. Among the various types of federated learning methods, horizontal federated learning (HFL) is the best-studied category and handles homogeneous feature spaces. In the case of heterogeneous feature spaces, however, HFL uses only the common features and leaves client-specific features unutilized. In this paper, we propose an HFL method using neural networks, named Federated Learning Enhanced by Continual learning for common and uncommon features (FLEC), which improves the performance of HFL by exploiting each client's unique features via a continual learning approach. FLEC splits the whole network into two subnetworks corresponding to the common features and the unique features, respectively. It jointly trains the first network on the common features through vanilla HFL, and locally trains the second network on the unique features, leveraging the knowledge of the first network via lateral connections without interfering with its federated training. We conduct experiments on various real-world datasets and show that FLEC greatly outperforms several baselines: vanilla HFL that uses only the common features, a local learning method that uses all features each client has, and a missing-data imputation method that fills in the features a client lacks with zeros or averages.
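The split architecture described in the abstract — a federated network over common features feeding a locally trained network over unique features through lateral connections — can be sketched as a forward pass. This is a minimal illustrative sketch, not the authors' implementation: all names, layer sizes, and the use of plain dense layers are assumptions, and the lateral connection is modeled simply by concatenating network A's hidden activations into network B's input.

```python
# Hypothetical sketch of the FLEC forward pass. Network A handles the
# common features (its weights would be trained jointly via vanilla HFL);
# network B handles the client-specific features and receives A's hidden
# activations through a lateral connection, so it can reuse A's knowledge
# without interfering with A's federated training.

def linear(weights, bias, x):
    """Dense layer: weights is a list of rows, x is a feature vector."""
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) + b
            for row, b in zip(weights, bias)]

def relu(v):
    return [max(0.0, a) for a in v]

def flec_forward(common_x, unique_x, params):
    # Network A: federated part, trained on the common feature space.
    h_a = relu(linear(params["A_w"], params["A_b"], common_x))
    # Lateral connection: B's input is the client's unique features
    # concatenated with A's hidden state (A's weights stay untouched
    # during B's local training).
    h_b = relu(linear(params["B_w"], params["B_b"], unique_x + h_a))
    # Local head combines both hidden states for this client's prediction.
    return linear(params["head_w"], params["head_b"], h_a + h_b)
```

In a real implementation the lateral connections would carry activations from every layer of the common network into the unique network (as in progressive-network-style continual learning), and gradients from the local loss would be blocked from flowing back into network A.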
Journal
- 人工知能学会論文誌 (Transactions of the Japanese Society for Artificial Intelligence)
  - 39 (3), A-N72_1-11, 2024-05-01
  - 一般社団法人 人工知能学会 (The Japanese Society for Artificial Intelligence)
Details
- CRID
  - 1390299993933119744
- ISSN
  - 13468030
  - 13460714
- Language of text
  - ja
- Data source types
  - JaLC
  - Crossref
- Abstract license flag
  - Not permitted