VFL-Cafe: Communication-Efficient Vertical Federated Learning via Dynamic Caching and Feature Selection

Bibliographic Details
Title: VFL-Cafe: Communication-Efficient Vertical Federated Learning via Dynamic Caching and Feature Selection
Authors: Jiahui Zhou, Han Liang, Tian Wu, Xiaoxi Zhang, Yu Jiang, Chee Wei Tan
Source: Entropy, Vol 27, Iss 1, p 66 (2025)
Publisher Information: MDPI AG, 2025.
Publication Year: 2025
Collection: LCC:Science; LCC:Astrophysics; LCC:Physics
Subject Terms: vertical federated learning, communication efficient, feature selection, dynamic caching, Science, Astrophysics, QB460-466, Physics, QC1-999
More Details: Vertical Federated Learning (VFL) is a promising category of Federated Learning that enables collaborative model training among distributed parties while protecting data privacy. Owing to its unique training architecture, a key challenge of VFL is the high communication cost of transmitting intermediate results between the Active Party and Passive Parties. Current communication-efficient VFL methods rely on stale results without careful selection, which can impair model accuracy, particularly in noisy data environments. To address these limitations, this work proposes VFL-Cafe, a new VFL training method that leverages dynamic caching and feature selection to improve both communication efficiency and model accuracy. In each communication round, the caching scheme allows multiple batches of intermediate results to be cached and strategically reused by different parties, reducing communication overhead while maintaining model accuracy. Additionally, a feature selection strategy is integrated into each round of local updates to eliminate the negative impact of noisy features, which would otherwise undermine the effectiveness of stale-result reuse and cause significant model degradation. A theoretical analysis then provides guidance on cache configuration to optimize performance. Finally, extensive experimental results validate VFL-Cafe’s efficacy, demonstrating notable improvements in communication efficiency and model accuracy.
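The abstract's two core ideas, reusing cached (stale) intermediate results to skip communication rounds and masking noisy features before they poison that reuse, can be illustrated with a minimal sketch. Note this is not the paper's actual VFL-Cafe algorithm: the class name, the fixed time-to-live caching policy, and the top-k importance mask are all illustrative assumptions standing in for the paper's adaptive cache configuration and feature selection strategy.

```python
import numpy as np


class PassiveParty:
    """Illustrative passive party that caches its intermediate results
    (embeddings) and serves stale copies for up to `cache_ttl` rounds,
    avoiding a fresh computation-and-transmission each round.
    NOTE: the caching policy here is a simplifying assumption, not the
    paper's method."""

    def __init__(self, features, weight, cache_ttl=3):
        self.features = features        # local (private) feature matrix
        self.weight = weight            # local model parameters
        self.cache_ttl = cache_ttl      # rounds a cached result may be reused
        self._cache = {}                # batch_id -> (embedding, reuse count)

    def forward(self, batch_id, batch_idx):
        """Return (embedding, served_from_cache) for one mini-batch."""
        entry = self._cache.get(batch_id)
        if entry is not None and entry[1] < self.cache_ttl:
            emb, age = entry
            self._cache[batch_id] = (emb, age + 1)  # reuse stale result
            return emb, True
        emb = self.features[batch_idx] @ self.weight  # fresh intermediate result
        self._cache[batch_id] = (emb, 0)
        return emb, False


def top_k_feature_mask(scores, k):
    """Keep only the k highest-scoring features (a simple stand-in for
    importance-based feature selection that drops noisy features)."""
    mask = np.zeros_like(scores, dtype=bool)
    mask[np.argsort(scores)[-k:]] = True
    return mask


party = PassiveParty(np.arange(8.0).reshape(4, 2), np.ones((2, 3)), cache_ttl=1)
emb1, hit1 = party.forward(0, [0, 1])   # fresh: computed and transmitted
emb2, hit2 = party.forward(0, [0, 1])   # stale reuse: no transmission needed
emb3, hit3 = party.forward(0, [0, 1])   # TTL exhausted: recomputed
```

With `cache_ttl=1`, only every second round for a given batch requires a fresh transmission, halving communication for that batch while the reused embedding stays numerically identical to the fresh one; the mask would be applied to the local feature columns before computing embeddings.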
Document Type: article
File Description: electronic resource
Language: English
ISSN: 1099-4300
Relation: https://www.mdpi.com/1099-4300/27/1/66; https://doaj.org/toc/1099-4300
DOI: 10.3390/e27010066
Access URL: https://doaj.org/article/f96d0191d58f44e08cd2d0934b6ca7f5
Accession Number: edsdoj.f96d0191d58f44e08cd2d0934b6ca7f5
Database: Directory of Open Access Journals