TY - GEN
T1 - Heterogeneous Federated Learning Through Multi-Branch Network
AU - Wang, Ching Hao
AU - Huang, Kang Yang
AU - Chen, Jun Cheng
AU - Shuai, Hong-Han
AU - Cheng, Wen-Huang
N1 - Publisher Copyright:
© 2021 IEEE
PY - 2021
Y1 - 2021
N2 - Recently, federated learning has gained increasing attention for privacy-preserving computation, since this learning paradigm allows models to be trained distributively without exchanging data across different institutions. However, the heterogeneity of the computational capabilities of edge devices is seldom discussed and analyzed in the current literature on heterogeneous federated learning. To address this issue, we propose a novel heterogeneous federated learning framework based on multi-branch deep neural network models, which enables the selection of a proper sub-branch model for each client device according to its computational capability. Meanwhile, we also present an aggregation method for model training, MFedAvg, which performs branch-wise averaging-based aggregation. With extensive experiments on MNIST, FashionMNIST, MedMNIST, and CIFAR-10, we demonstrate that our proposed approaches achieve satisfactory performance with guaranteed convergence and effectively utilize all the available resources for training across different devices, with lower communication cost than their homogeneous counterpart.
AB - Recently, federated learning has gained increasing attention for privacy-preserving computation, since this learning paradigm allows models to be trained distributively without exchanging data across different institutions. However, the heterogeneity of the computational capabilities of edge devices is seldom discussed and analyzed in the current literature on heterogeneous federated learning. To address this issue, we propose a novel heterogeneous federated learning framework based on multi-branch deep neural network models, which enables the selection of a proper sub-branch model for each client device according to its computational capability. Meanwhile, we also present an aggregation method for model training, MFedAvg, which performs branch-wise averaging-based aggregation. With extensive experiments on MNIST, FashionMNIST, MedMNIST, and CIFAR-10, we demonstrate that our proposed approaches achieve satisfactory performance with guaranteed convergence and effectively utilize all the available resources for training across different devices, with lower communication cost than their homogeneous counterpart.
KW - Federated Learning
KW - Heterogeneous Federated Learning
UR - http://www.scopus.com/inward/record.url?scp=85126428378&partnerID=8YFLogxK
U2 - 10.1109/ICME51207.2021.9428189
DO - 10.1109/ICME51207.2021.9428189
M3 - Conference contribution
AN - SCOPUS:85126428378
T3 - Proceedings - IEEE International Conference on Multimedia and Expo
BT - 2021 IEEE International Conference on Multimedia and Expo, ICME 2021
PB - IEEE Computer Society
T2 - 2021 IEEE International Conference on Multimedia and Expo, ICME 2021
Y2 - 5 July 2021 through 9 July 2021
ER -