Title: |
Real World Federated Learning with a Knowledge Distilled Transformer for Cardiac CT Imaging |
Authors: |
Tölle, Malte, Garthe, Philipp, Scherer, Clemens, Seliger, Jan Moritz, Leha, Andreas, Krüger, Nina, Simm, Stefan, Martin, Simon, Eble, Sebastian, Kelm, Halvar, Bednorz, Moritz, André, Florian, Bannas, Peter, Diller, Gerhard, Frey, Norbert, Groß, Stefan, Hennemuth, Anja, Kaderali, Lars, Meyer, Alexander, Nagel, Eike, Orwat, Stefan, Seiffert, Moritz, Friede, Tim, Seidler, Tim, Engelhardt, Sandy |
Publication Year: |
2024 |
Collection: |
Computer Science |
Subject Terms: |
Electrical Engineering and Systems Science - Image and Video Processing, Computer Science - Computer Vision and Pattern Recognition |
More Details: |
Federated learning is a well-established technique for utilizing decentralized data while preserving privacy. However, real-world applications often face challenges such as partially labeled datasets, where only a few locations hold certain expert annotations, leaving large portions of unlabeled data unused. Leveraging this unlabeled data could improve transformer architectures' ability to learn in regimes with small and diversely annotated label sets. We conduct the largest federated cardiac CT analysis to date (n=8,104) in a real-world setting across eight hospitals. Our two-step semi-supervised strategy distills knowledge from task-specific CNNs into a transformer: first, the CNNs predict on unlabeled data for each label type; then, the transformer learns from these predictions through label-specific heads. This improves predictive accuracy, enables simultaneous learning of all partial labels across the federation, and outperforms UNet-based models in generalizability on downstream tasks. Code and model weights are made openly available to support future cardiac CT analysis.
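The two-step distillation described in the abstract can be illustrated with a minimal sketch. This is not the authors' released implementation: all class and function names below (LabelSpecificTransformer, distill, the teacher CNN interface, the example label types and class counts) are hypothetical placeholders, federated weight aggregation across the eight hospitals is omitted, and a segmentation-style teacher would use per-voxel losses rather than the image-level classification shown here.

    # Minimal sketch of the two-step semi-supervised distillation described in the
    # abstract, NOT the authors' released code. Teacher CNNs pseudo-label the
    # unlabeled volumes per label type; a shared transformer with label-specific
    # heads then fits those pseudo-labels. Names and hyperparameters are
    # illustrative assumptions; federated aggregation across sites is omitted.
    import torch
    import torch.nn as nn

    class LabelSpecificTransformer(nn.Module):
        """Shared transformer encoder with one classification head per label type."""
        def __init__(self, label_classes, embed_dim=256, num_heads=8, num_layers=4):
            super().__init__()
            # Naive 3D patch embedding for single-channel CT volumes (assumption).
            self.patch_embed = nn.Conv3d(1, embed_dim, kernel_size=16, stride=16)
            layer = nn.TransformerEncoderLayer(embed_dim, num_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers)
            # One lightweight head per partial label type, e.g. {"valve": 4, "calc": 2}.
            self.heads = nn.ModuleDict(
                {name: nn.Linear(embed_dim, n) for name, n in label_classes.items()})

        def forward(self, x, label_type):
            tokens = self.patch_embed(x).flatten(2).transpose(1, 2)  # (B, N, embed_dim)
            feats = self.encoder(tokens).mean(dim=1)                 # pooled representation
            return self.heads[label_type](feats)

    def distill(teacher_cnns, student, unlabeled_loader, epochs=1, lr=1e-4):
        """Step 1: each task-specific CNN teacher pseudo-labels the unlabeled scans.
        Step 2: the transformer student fits those pseudo-labels via its matching head."""
        opt = torch.optim.AdamW(student.parameters(), lr=lr)
        ce = nn.CrossEntropyLoss()
        for _ in range(epochs):
            for volumes in unlabeled_loader:          # volumes: (B, 1, D, H, W) tensor
                loss = torch.zeros(())
                for label_type, teacher in teacher_cnns.items():
                    with torch.no_grad():
                        pseudo = teacher(volumes).argmax(dim=1)   # hard pseudo-labels
                    loss = loss + ce(student(volumes, label_type), pseudo)
                opt.zero_grad()
                loss.backward()
                opt.step()
        return student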
Document Type: |
Working Paper |
Access URL: |
http://arxiv.org/abs/2407.07557 |
Accession Number: |
edsarx.2407.07557 |
Database: |
arXiv |