Federated Learning Enhanced by Continual Learning for Common and Uncommon Features

Bibliographic Information

Other Title
  • Improvement of Federated Learning via Continual Learning for Overlapping and Non-overlapping Features

Abstract

<p>Federated learning is a promising machine learning technique that enables multiple clients to collaboratively build a model without revealing their raw data to each other. Among the various types of federated learning, horizontal federated learning (HFL) is the best-studied category and handles homogeneous feature spaces. In the case of heterogeneous feature spaces, however, HFL uses only the common features and leaves client-specific features unutilized. In this paper, we propose an HFL method using neural networks, named Federated Learning Enhanced by Continual learning for common and uncommon features (FLEC), which improves the performance of HFL by exploiting each client's unique features via a continual learning approach. FLEC splits the whole network into two subnetworks corresponding to common features and unique features, respectively. It jointly trains the first network on the common features through vanilla HFL, and locally trains the second network on the unique features, leveraging the knowledge of the first network via lateral connections without interfering with its federated training. We conduct experiments on various real-world datasets and show that FLEC greatly outperforms several baselines: a vanilla HFL method that uses only the common features, a local learning method that uses all features each client has, and a missing-data imputation method that fills in the features a client lacks with zeros or averages.</p>
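The architecture described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the dimensions, weight names, and the simple FedAvg helper are all assumptions, and the lateral connection is modeled in the style of progressive-network-style transfer, where the first network's hidden activations feed the second network but receive no gradient updates from it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only, not from the paper)
D_COMMON, D_UNIQUE, HIDDEN = 4, 3, 8

# --- Network 1: trained jointly on the common features via vanilla HFL ---
W1 = rng.normal(size=(D_COMMON, HIDDEN))

def federated_average(client_weights):
    """Vanilla FedAvg aggregation: element-wise average of client weights."""
    return np.mean(client_weights, axis=0)

# --- Network 2: trained locally on the client-specific (unique) features,
# with a lateral connection reading Network 1's hidden activations ---
W2 = rng.normal(size=(D_UNIQUE, HIDDEN))
U = rng.normal(size=(HIDDEN, HIDDEN))  # lateral connection weights

def forward(x_common, x_unique):
    h1 = np.tanh(x_common @ W1)  # shared representation from Network 1
    # Lateral connection: h1 feeds Network 2, but during local training
    # only W2 and U would be updated, so W1's federated training is
    # never interfered with (continual-learning-style knowledge reuse).
    h2 = np.tanh(x_unique @ W2 + h1 @ U)
    return h2
```

During local training, a client would compute gradients only with respect to `W2` and `U`, treating `h1` as a fixed input, which is what keeps the second network's training from disturbing the federated first network.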


References (29)
