FedConv: A Learning-on-Model Paradigm for Heterogeneous Federated Clients

Bibliographic Details
Title: FedConv: A Learning-on-Model Paradigm for Heterogeneous Federated Clients
Authors: Shen, Leming; Yang, Qiang; Cui, Kaiyan; Zheng, Yuanqing; Wei, Xiao-Yong; Liu, Jianwei; Han, Jinsong
Publication Year: 2025
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence
Abstract: Federated Learning (FL) facilitates collaborative training of a shared global model without exposing clients' private data. In practical FL systems, clients (e.g., edge servers, smartphones, and wearables) typically have disparate system resources. Conventional FL, however, adopts a one-size-fits-all solution, in which a homogeneous large global model is transmitted to and trained on every client, resulting in an overwhelming workload for less capable clients and starvation for the others. To address this issue, we propose FedConv, a client-friendly FL framework that minimizes the computation and memory burden on resource-constrained clients by providing heterogeneous customized sub-models. FedConv features a novel learning-on-model paradigm that learns the parameters of the heterogeneous sub-models via convolutional compression. Unlike traditional compression methods, the compressed models in FedConv can be trained directly on clients without decompression. To aggregate the heterogeneous sub-models, we propose transposed convolutional dilation, which converts them back to large models of a unified size while retaining personalized information from clients. The compression and dilation processes are transparent to clients and are optimized on the server using a small public dataset. Extensive experiments on six datasets demonstrate that FedConv outperforms state-of-the-art FL systems, improving model accuracy by more than 35% on average while reducing computation and communication overhead by 33% and 25%, respectively. (A minimal code sketch of the compression and dilation steps follows the record fields below.)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2502.20639
Accession Number: edsarx.2502.20639
Database: arXiv
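Illustrative sketch: The abstract describes a two-step mechanism: a server-side convolution compresses the global model's parameters into a smaller sub-model that clients train directly, and a transposed convolution dilates the trained sub-models back to a unified size for aggregation. Below is a minimal PyTorch sketch of that idea under stated assumptions; the kernel size, stride, weight shape, and the names compress and dilate are illustrative choices, not the paper's actual implementation.

    # Learning-on-model sketch: convolve over model *weights*, not data.
    # All hyperparameters and names here are assumptions for illustration.
    import torch
    import torch.nn as nn

    # Server-side operators; per the abstract these are themselves learned
    # on a small public dataset, but here they are randomly initialized.
    compress = nn.Conv2d(1, 1, kernel_size=2, stride=2, bias=False)
    dilate = nn.ConvTranspose2d(1, 1, kernel_size=2, stride=2, bias=False)

    # Treat one global-model weight matrix as a 1-channel "image".
    global_w = torch.randn(64, 64)
    sub_w = compress(global_w[None, None])  # (1, 1, 32, 32): smaller sub-model weight
    # ...a resource-constrained client trains sub_w directly, without decompression...
    restored_w = dilate(sub_w)              # (1, 1, 64, 64): unified size for aggregation
    print(sub_w.shape, restored_w.shape)

In the full system, heterogeneous clients would presumably receive sub-models of different sizes (e.g., via different strides or kernels), and the server would aggregate the dilated models; that orchestration is omitted from this sketch.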