Federated Learning with Reduced Information Leakage and Computation

Bibliographic Details
Title: Federated Learning with Reduced Information Leakage and Computation
Authors: Yin, Tongxin; Tan, Xuwei; Zhang, Xueru; Khalili, Mohammad Mahdi; Liu, Mingyan
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning
More Details: Federated learning (FL) is a distributed learning paradigm that allows multiple decentralized clients to collaboratively learn a common model without sharing local data. Although local data is not exposed directly, privacy concerns nonetheless exist, as clients' sensitive information can be inferred from intermediate computations. Moreover, such information leakage accumulates substantially over time as the same data is repeatedly used during the iterative learning process. As a result, it can be particularly difficult to balance the privacy-accuracy trade-off when designing privacy-preserving FL algorithms. This paper introduces Upcycled-FL, a simple yet effective strategy that applies first-order approximation at every even round of model update. Under this strategy, half of the FL updates incur no information leakage and incur much lower computation and transmission costs. We first analyze the convergence (rate) of Upcycled-FL and then apply two perturbation mechanisms to preserve privacy. Extensive experiments on both synthetic and real-world data show that the Upcycled-FL strategy can be adapted to many existing FL frameworks and consistently improve the privacy-accuracy trade-off. (A sketch of the even/odd round schedule follows this record.)
Comment: Accepted by Transactions on Machine Learning Research (TMLR)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2310.06341
Accession Number: edsarx.2310.06341
Database: arXiv
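For intuition, here is a minimal sketch (in Python/NumPy) of the alternating schedule the abstract describes: odd rounds perform a standard federated update on client data, while even rounds reuse the previous two global models through a first-order extrapolation, touching no client data. The toy least-squares local objective, the FedAvg-style aggregation, and the coefficient `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def local_update(w, data, lr=0.1, steps=5):
    """One client's local training on a toy least-squares loss (assumed objective)."""
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of the mean squared residual
        w = w - lr * grad
    return w

def upcycled_fl(clients, w0, rounds=10, lam=0.5):
    """Alternate data-dependent odd rounds with data-free even rounds."""
    w_prev, w = w0.copy(), w0.copy()
    for t in range(1, rounds + 1):
        if t % 2 == 1:
            # Odd round: standard FL step; clients touch local data, so this
            # round incurs information leakage plus full computation and
            # transmission costs.
            w_new = np.mean([local_update(w.copy(), d) for d in clients], axis=0)
        else:
            # Even round: first-order extrapolation from the previous two
            # global models; no client data, computation, or transmission.
            # The exact coefficient/form in the paper may differ.
            w_new = w + lam * (w - w_prev)
        w_prev, w = w, w_new
    return w

# Hypothetical usage on synthetic data:
rng = np.random.default_rng(0)
clients = [(rng.standard_normal((50, 3)), rng.standard_normal(50)) for _ in range(4)]
w_final = upcycled_fl(clients, w0=np.zeros(3))
```

Because the even-round update is a single vector operation at the server, any privacy perturbation budget and client compute are spent only on the odd rounds, which is the mechanism behind the improved privacy-accuracy trade-off claimed in the abstract.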