TY - JOUR
T1 - Wireless Federated Learning with Limited Communication and Differential Privacy
AU - Sonee, Amir
AU - Rini, Stefano
AU - Huang, Yu Chih
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - This paper investigates the role of dimensionality reduction in efficient communication and differential privacy (DP) of the local datasets at the remote users for an over-the-air computation (AirComp)-based federated learning (FL) model. More precisely, we consider the FL setting in which clients are prompted to train a machine learning model through simultaneous channel-aware and limited communications with a parameter server (PS) over a Gaussian multiple-access channel (GMAC), so that transmissions sum coherently at the PS, which is globally aware of the channel coefficients. For this setting, an algorithm is proposed based on applying (i) federated stochastic gradient descent (FedSGD) for minimizing a given loss function based on the local gradients, (ii) Johnson-Lindenstrauss (JL) random projection for reducing the dimension of the local updates, and (iii) artificial noise to further aid users' privacy. For this scheme, our results show that the local DP (LDP) performance is improved mainly by injecting noise of greater variance on each dimension while keeping the sensitivity of the projected vectors unchanged. At the same time, the convergence rate is slower than in the case without dimensionality reduction. Since the privacy gain outweighs the slower convergence, the trade-off between privacy and convergence is increased, but it is shown to lessen in the high-dimensional regime, yielding almost the same trade-off with much less communication cost.
AB - This paper investigates the role of dimensionality reduction in efficient communication and differential privacy (DP) of the local datasets at the remote users for an over-the-air computation (AirComp)-based federated learning (FL) model. More precisely, we consider the FL setting in which clients are prompted to train a machine learning model through simultaneous channel-aware and limited communications with a parameter server (PS) over a Gaussian multiple-access channel (GMAC), so that transmissions sum coherently at the PS, which is globally aware of the channel coefficients. For this setting, an algorithm is proposed based on applying (i) federated stochastic gradient descent (FedSGD) for minimizing a given loss function based on the local gradients, (ii) Johnson-Lindenstrauss (JL) random projection for reducing the dimension of the local updates, and (iii) artificial noise to further aid users' privacy. For this scheme, our results show that the local DP (LDP) performance is improved mainly by injecting noise of greater variance on each dimension while keeping the sensitivity of the projected vectors unchanged. At the same time, the convergence rate is slower than in the case without dimensionality reduction. Since the privacy gain outweighs the slower convergence, the trade-off between privacy and convergence is increased, but it is shown to lessen in the high-dimensional regime, yielding almost the same trade-off with much less communication cost.
KW - Federated edge learning
KW - Local differential privacy
KW - Over-the-air-computation
KW - Random projection
UR - http://www.scopus.com/inward/record.url?scp=85184380268&partnerID=8YFLogxK
U2 - 10.1109/GLOBECOM46510.2021.9685320
DO - 10.1109/GLOBECOM46510.2021.9685320
M3 - Conference article
AN - SCOPUS:85184380268
SN - 2334-0983
JO - Proceedings - IEEE Global Communications Conference, GLOBECOM
JF - Proceedings - IEEE Global Communications Conference, GLOBECOM
T2 - 2021 IEEE Global Communications Conference, GLOBECOM 2021
Y2 - 7 December 2021 through 11 December 2021
ER -