Unsupervised Federated Learning for Unbalanced Data

Mykola Servetnyk, Carrson C. Fung, Zhu Han

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

13 Scopus citations


This work considers unsupervised learning tasks implemented within the federated learning framework to satisfy the stringent low-latency and privacy requirements of emerging applications. The proposed algorithm is based on Dual Averaging (DA), where the gradients of each agent are aggregated at a central node. While federated learning has advantages in terms of distributed computation, its training accuracy degrades significantly when data is nonuniformly distributed across devices. This work therefore proposes two weight computation algorithms, one using fixed-size bins and the other using self-organizing maps (SOM), which resolves the dimensionality problem inherent in the first method. Simulation results show that the proposed algorithms' performance is comparable to the scenario in which all data is uploaded and processed in a centralized cloud.
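The aggregation scheme described above can be illustrated with a minimal sketch: a server runs dual averaging on an unsupervised objective (here, a single-centroid/global-mean problem) while weighting each agent's gradient to counteract data imbalance. This is a hypothetical illustration, not the authors' implementation; the sample-count-share weighting below stands in for the paper's fixed-size-bin scheme, and the step-size choice is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unbalanced local datasets: agent 0 holds far more samples than agent 1.
local_data = [
    rng.normal(loc=2.0, scale=0.5, size=(900, 2)),   # data-rich agent
    rng.normal(loc=-1.0, scale=0.5, size=(100, 2)),  # data-poor agent
]

# Weighting (hypothetical stand-in for the paper's fixed-size-bin scheme):
# weight each agent by its share of the total sample count, so an
# unweighted average does not over-represent the data-poor agent.
counts = np.array([len(d) for d in local_data], dtype=float)
weights = counts / counts.sum()

# Federated dual averaging on a 1-centroid unsupervised objective:
# the server sums weighted local gradients into a dual variable z and
# maps it back to the primal iterate c with a 1/sqrt(t) scaling.
c = np.zeros(2)  # shared model (centroid)
z = np.zeros(2)  # accumulated dual (gradient) state
for t in range(1, 2001):
    grads = [2.0 * (c - d.mean(axis=0)) for d in local_data]  # local gradients
    z += sum(w * g for w, g in zip(weights, grads))           # server aggregation
    c = -z / (2.0 * np.sqrt(t))                               # DA primal step

global_mean = np.vstack(local_data).mean(axis=0)
print(c, global_mean)  # c approaches the global mean despite the imbalance
```

With count-proportional weights, the weighted combination of local means equals the global mean exactly, so the centroid converges to the same solution a centralized run would find; an unweighted average would instead be biased toward the data-poor agent.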

Original language: American English
Title of host publication: 2020 IEEE Global Communications Conference, GLOBECOM 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728182988
State: Published - Dec 2020
Event: 2020 IEEE Global Communications Conference, GLOBECOM 2020 - Virtual, Taipei, Taiwan
Duration: 7 Dec 2020 - 11 Dec 2020

Publication series

Name: 2020 IEEE Global Communications Conference, GLOBECOM 2020 - Proceedings


Conference: 2020 IEEE Global Communications Conference, GLOBECOM 2020
City: Virtual, Taipei


Keywords

  • distributed optimization
  • dual averaging algorithm
  • federated learning
  • gradient weighting
  • self-organizing maps
  • unsupervised learning


