TY - GEN
T1 - HyperFed
T2 - 10th International Conference on Dependable Systems and Their Applications, DSA 2023
AU - Nuannimnoi, Sirapop
AU - Delizy, Florian
AU - Huang, Ching Yao
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Traditional machine learning solutions rely on Cloud-based services, which can lead to major problems including security issues, private data leakage, unacceptable latency, and excessive operating expenses. Federated Learning (FL) techniques were introduced to tackle these challenges by allowing distributed edge nodes/servers to collaboratively train AI models without sharing raw training data. However, some nodes may intentionally or unintentionally upload virtual (fake) models to the main server. This behavior is called "Free-riding", and it can negatively affect the overall performance of the FL system. In this paper, we propose a new adaptive contribution-based aggregation technique using hypernetworks, namely "HyperFed", and evaluate it on two important aspects: resistance against free-riders' fake contributions, and the average convergence speed of the global model on local datasets. Our simulation results on the Federated EMNIST dataset show promising performance in comparison to the FedAvg and AdaFed aggregation techniques.
AB - Traditional machine learning solutions rely on Cloud-based services, which can lead to major problems including security issues, private data leakage, unacceptable latency, and excessive operating expenses. Federated Learning (FL) techniques were introduced to tackle these challenges by allowing distributed edge nodes/servers to collaboratively train AI models without sharing raw training data. However, some nodes may intentionally or unintentionally upload virtual (fake) models to the main server. This behavior is called "Free-riding", and it can negatively affect the overall performance of the FL system. In this paper, we propose a new adaptive contribution-based aggregation technique using hypernetworks, namely "HyperFed", and evaluate it on two important aspects: resistance against free-riders' fake contributions, and the average convergence speed of the global model on local datasets. Our simulation results on the Federated EMNIST dataset show promising performance in comparison to the FedAvg and AdaFed aggregation techniques.
KW - Edge Computing
KW - Federated Learning
KW - Free-riding
KW - Hypernetworks
KW - Model aggregation
KW - Reputation Mechanism
UR - http://www.scopus.com/inward/record.url?scp=85179509139&partnerID=8YFLogxK
U2 - 10.1109/DSA59317.2023.00025
DO - 10.1109/DSA59317.2023.00025
M3 - Conference contribution
AN - SCOPUS:85179509139
T3 - Proceedings - 2023 10th International Conference on Dependable Systems and Their Applications, DSA 2023
SP - 126
EP - 134
BT - Proceedings - 2023 10th International Conference on Dependable Systems and Their Applications, DSA 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 10 August 2023 through 11 August 2023
ER -