HiQuFlexAsync: Hierarchical Federated Learning with Quantization, Flexible Client Selection and Asynchronous Communication

Authors: Ze Zhao, Yifan Liu, Donglin Pan, Yi Liu, Zhenpeng Liu
Conference: ICIC 2024 Posters, Tianjin, China, August 5-8, 2024
Pages: 297-314
Keywords: Federated Learning, Hierarchical Mechanism, Quantization, Client Selection, Asynchronous Aggregation.

Abstract

In three-tier cloud-edge federated learning, uneven data distribution across clients can degrade model performance, while the additional communication introduced by the multi-tier hierarchy can reduce system efficiency. To mitigate these challenges, a novel method named HiQuFlexAsync is introduced: an asynchronous three-tier federated learning approach with quantization capabilities. Within the HiQuFlexAsync framework, a new quantizer naturally compresses local and edge gradients, and an algorithm called Cost-Optimized Heterogeneous Client-Edge Association (COHEA) optimizes client selection by jointly accounting for the data heterogeneity and physical diversity of clients. Simulation experiments on the MNIST and CIFAR-10 datasets show that, compared with the traditional three-tier architecture HierFAVG, HiQuFlexAsync achieves approximately a 5.6% increase in accuracy and a 12.2% improvement in efficiency.
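The abstract does not specify the quantizer's construction, so as a rough illustration of how gradient quantization reduces uplink traffic in such systems, here is a generic QSGD-style unbiased stochastic quantizer; the function names, the level count, and the scheme itself are assumptions for illustration, not the paper's actual design:

```python
import numpy as np

def quantize(grad, levels=4, rng=None):
    """QSGD-style stochastic quantizer (illustrative; not the paper's scheme).

    Each entry is mapped to one of `levels` uniform magnitude levels in
    [0, max|g|], keeping its sign; stochastic rounding makes the quantizer
    unbiased, so the averaged dequantized gradients converge in expectation.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = np.max(np.abs(grad))
    if scale == 0:
        return np.zeros_like(grad, dtype=np.int8), scale
    normalized = np.abs(grad) / scale * levels      # values in [0, levels]
    lower = np.floor(normalized)
    prob = normalized - lower                       # probability of rounding up
    q = lower + (rng.random(grad.shape) < prob)     # stochastic rounding
    # Transmit only small integers plus one float scale per tensor.
    return (np.sign(grad) * q).astype(np.int8), scale

def dequantize(q, scale, levels=4):
    """Reconstruct an unbiased estimate of the original gradient."""
    return q.astype(np.float64) * scale / levels
```

With, say, 4 levels each coordinate fits in a few bits plus one shared scale, instead of 32 bits per float, which is the kind of compression that lets both client-to-edge and edge-to-cloud exchanges shrink.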