2022 journal article
Communication-Efficient Federated Learning via Predictive Coding
IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 16(3), 369–380.
Federated learning enables remote workers to collaboratively train a shared machine learning model while keeping training data local. For wireless mobile devices, communication overhead is a critical bottleneck due to limited power and bandwidth. Prior work has used data compression tools such as quantization and sparsification to reduce this overhead. In this paper, we propose a predictive coding-based communication scheme for federated learning. The scheme shares prediction functions among all devices and lets each worker transmit a compressed residual vector derived from the reference. In each communication round, we select the predictor and quantizer based on the rate-distortion cost, and further reduce redundancy using entropy coding. Extensive simulations show that the communication cost can be reduced by up to 99% compared with other baselines, while achieving better learning performance.
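The encoder described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a uniform scalar quantizer, estimates the entropy-coding rate from the empirical symbol distribution, and uses a simple candidate set of predictors (zero prediction vs. the previous update). The function names, the trade-off weight `lam`, and the candidate sets are all hypothetical choices for the sketch.

```python
import numpy as np

def entropy_bits(symbols):
    # Estimate the bits an entropy coder would need: empirical entropy
    # of the quantized symbols times the number of symbols.
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum() * len(symbols))

def quantize(x, step):
    # Uniform scalar quantizer with the given step size.
    return np.round(x / step).astype(int)

def dequantize(q, step):
    return q * step

def encode_round(update, predictors, steps, lam=1.0):
    # Try every (predictor, quantizer) pair and pick the one minimizing the
    # rate-distortion cost: estimated rate + lam * reconstruction MSE.
    best = None
    for pi, pred in enumerate(predictors):
        residual = update - pred
        for si, step in enumerate(steps):
            q = quantize(residual, step)
            recon = pred + dequantize(q, step)
            distortion = float(np.mean((update - recon) ** 2))
            cost = entropy_bits(q) + lam * distortion
            if best is None or cost < best[0]:
                best = (cost, pi, si, q)
    _, pi, si, q = best
    # The worker transmits only the chosen indices and the quantized residual.
    return pi, si, q

def decode_round(pi, si, q, predictors, steps):
    # The server reconstructs the update from the shared predictor and residual.
    return predictors[pi] + dequantize(q, steps[si])

# Toy round: the current update is strongly correlated with the previous one,
# so predicting from the last update leaves a small, cheap-to-code residual.
rng = np.random.default_rng(0)
prev_update = rng.normal(size=1000)
update = prev_update + 0.05 * rng.normal(size=1000)
predictors = [np.zeros_like(update), prev_update]  # no prediction vs. last update
steps = [0.01, 0.05, 0.2]                          # candidate quantizer step sizes
pi, si, q = encode_round(update, predictors, steps)
recon = decode_round(pi, si, q, predictors, steps)
```

In this setup the selection should favor the temporal predictor, since its residual has much lower entropy than the raw update, which is how prediction reduces the transmitted rate.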