TY - JOUR
T1 - Secure federated learning with efficient communication in vehicle network
AU - Li, Yinglong
AU - Zhang, Zhenjiang
AU - Zhang, Zhiyuan
AU - Kao, Yi Chih
N1 - Publisher Copyright:
© 2020 Taiwan Academic Network Management Committee. All rights reserved.
PY - 2020
Y1 - 2020
N2 - Internet of Vehicles (IoV) generates large amounts of data at the network edge. Machine learning models are often built on these data to enable the detection, classification, and prediction of traffic events. Due to network bandwidth, storage, and especially privacy concerns, it is often impossible to send all the IoV data to the edge server for centralized model training. Federated learning is a promising paradigm for distributed machine learning that enables edge nodes to train models locally. As vehicles usually have unreliable and relatively slow network connections, reducing the communication overhead is important. In this paper, we propose a secure federated learning with efficient communication (SFLEC) scheme for vehicle networks. To protect the privacy of local updates, we upload the updated model parameters with local differential privacy. We further propose a client selection approach that identifies relevant updates trained by vehicles and prevents irrelevant updates from being uploaded, reducing the network footprint to achieve efficient communication. We then prove that the loss function of the trained FL model in our scheme converges theoretically. Finally, we evaluate our scheme on two datasets and compare it with basic FL. Our proposed scheme improves communication efficiency while preserving data privacy.
AB - Internet of Vehicles (IoV) generates large amounts of data at the network edge. Machine learning models are often built on these data to enable the detection, classification, and prediction of traffic events. Due to network bandwidth, storage, and especially privacy concerns, it is often impossible to send all the IoV data to the edge server for centralized model training. Federated learning is a promising paradigm for distributed machine learning that enables edge nodes to train models locally. As vehicles usually have unreliable and relatively slow network connections, reducing the communication overhead is important. In this paper, we propose a secure federated learning with efficient communication (SFLEC) scheme for vehicle networks. To protect the privacy of local updates, we upload the updated model parameters with local differential privacy. We further propose a client selection approach that identifies relevant updates trained by vehicles and prevents irrelevant updates from being uploaded, reducing the network footprint to achieve efficient communication. We then prove that the loss function of the trained FL model in our scheme converges theoretically. Finally, we evaluate our scheme on two datasets and compare it with basic FL. Our proposed scheme improves communication efficiency while preserving data privacy.
KW - Client selection
KW - Edge computing
KW - Federated learning
KW - Privacy preservation
UR - http://www.scopus.com/inward/record.url?scp=85107229053&partnerID=8YFLogxK
U2 - 10.3966/160792642020122107022
DO - 10.3966/160792642020122107022
M3 - Article
AN - SCOPUS:85107229053
SN - 1607-9264
VL - 21
SP - 2075
EP - 2084
JO - Journal of Internet Technology
JF - Journal of Internet Technology
IS - 7
ER -