2022 journal article

Communication-Efficient Federated Learning via Predictive Coding

IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 16(3), 369–380.

By: K. Yue, R. Jin, C. Wong & H. Dai

author keywords: Predictive models; Servers; Collaborative work; Predictive coding; Entropy coding; Costs; Quantization (signal); Federated learning; distributed optimization; predictive coding
Source: Web of Science
Added: May 31, 2022

Federated learning enables remote workers to collaboratively train a shared machine learning model while keeping the training data local. For wireless mobile devices, communication overhead is a critical bottleneck due to limited power and bandwidth. Prior work has applied various data compression tools, such as quantization and sparsification, to reduce this overhead. In this paper, we propose a predictive coding based communication scheme for federated learning. The scheme uses prediction functions shared among all devices and lets each worker transmit only a compressed residual vector computed against the predicted reference. In each communication round, we select the predictor and quantizer based on the rate-distortion cost, and further reduce redundancy with entropy coding. Extensive simulations show that the communication cost can be reduced by up to 99% with better learning performance when compared with other baselines.
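The abstract outlines a per-round pipeline: predict the update from shared history, quantize the prediction residual, choose the (predictor, quantizer) pair by a rate-distortion criterion, then entropy-code the result. The sketch below illustrates that idea under stated assumptions; it is not the paper's implementation, and all names, candidate predictors, step sizes, and the Lagrangian weight `lam` are illustrative choices.

```python
import numpy as np

# Illustrative sketch (not the authors' code) of the compression step the
# abstract describes: predict the model update, quantize the residual, and
# pick the predictor/quantizer pair with the lowest rate-distortion cost.

def predictors(history):
    """Candidate prediction functions shared by server and workers.
    Assumes `history` holds at least the two most recent updates."""
    prev = history[-1]
    return {
        "zero": np.zeros_like(prev),       # no prediction (send raw update)
        "previous": prev,                  # repeat the last update
        "linear": 2 * prev - history[-2],  # linear extrapolation
    }

def quantize(residual, step):
    """Uniform scalar quantization of the residual vector."""
    return np.round(residual / step) * step

def rate_estimate(q_residual, step):
    """Empirical-entropy proxy (in bits) for the entropy-coded rate
    of the quantized residual symbols."""
    symbols = np.round(q_residual / step).astype(int)
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return symbols.size * -(p * np.log2(p)).sum()

def compress_update(update, history, steps=(1e-3, 1e-2), lam=1e-4):
    """Select predictor and quantizer by minimizing distortion + lam * rate,
    a standard Lagrangian rate-distortion cost."""
    best = None
    for name, pred in predictors(history).items():
        residual = update - pred
        for step in steps:
            q = quantize(residual, step)
            cost = np.sum((residual - q) ** 2) + lam * rate_estimate(q, step)
            if best is None or cost < best[0]:
                best = (cost, name, step, q)
    _, name, step, q = best
    # The worker would transmit (name, step) plus the entropy-coded residual;
    # the server reconstructs: update ≈ predictors(history)[name] + q.
    return name, step, q
```

Because the prediction functions and the update history are shared, the server can regenerate the reference locally, so only the selector metadata and the coded residual need to cross the channel; this is what drives the large communication savings reported in the abstract.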