Title: |
LayerFold: A Python library to reduce the depth of neural networks |
Authors: |
Giommaria Pilo, Nour Hezbri, André Pereira e Ferreira, Victor Quétu, Enzo Tartaglione |
Source: |
SoftwareX, Vol 29, Pp 102030 (2025)
Publisher Information: |
Elsevier, 2025. |
Publication Year: |
2025 |
Collection: |
LCC:Computer software |
Subject Terms: |
Deep learning, Layer collapse, Depth compression, Pruning, PyTorch, Computer software, QA76.75-76.765 |
More Details: |
Large-scale models are the backbone of Computer Vision and Natural Language Processing, and their generalizability allows for transfer learning and deployment in different scenarios. However, their large size makes reducing their computational and memory demands a persistent challenge. Recent research proposes exploiting “layer collapse”, a condition where multiple layers can be combined because their non-linearities collapse to linear operators. While this is an important discovery, most studies remain theoretical, often replacing non-linearities with simple identity functions and not providing a real implementation of the more compact architecture. Our contribution is LayerFold, a library that studies and implements the merging of collapsed layers. We address typical cases, from fully connected to convolutional layers, discussing constraints and prospective challenges. Our tests on edge devices reveal that merely reducing network depth does not always result in faster computation, even on GPU-equipped hardware. This work raises important warnings and opens the door to further advances in efficient model deployment.
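Illustration (a minimal sketch, not LayerFold's actual API): when the non-linearity between two fully connected layers collapses to the identity, the composition y = W2(W1 x + b1) + b2 reduces to a single linear layer with weight W2 W1 and bias W2 b1 + b2. The hypothetical helper fuse_linear below demonstrates that merge in plain PyTorch; convolutional layers require an analogous but more involved kernel composition, which the library addresses.

    import torch
    import torch.nn as nn

    def fuse_linear(fc1: nn.Linear, fc2: nn.Linear) -> nn.Linear:
        # Hypothetical example: fuse two consecutive Linear layers whose
        # intermediate activation has collapsed to the identity.
        fused = nn.Linear(fc1.in_features, fc2.out_features, bias=True)
        with torch.no_grad():
            # Composite weight: W2 @ W1
            fused.weight.copy_(fc2.weight @ fc1.weight)
            b1 = fc1.bias if fc1.bias is not None else torch.zeros(fc1.out_features)
            b2 = fc2.bias if fc2.bias is not None else torch.zeros(fc2.out_features)
            # Composite bias: W2 @ b1 + b2
            fused.bias.copy_(fc2.weight @ b1 + b2)
        return fused

    # Sanity check: the fused layer reproduces the two-layer stack.
    fc1, fc2 = nn.Linear(16, 32), nn.Linear(32, 8)
    x = torch.randn(4, 16)
    assert torch.allclose(fuse_linear(fc1, fc2)(x), fc2(fc1(x)), atol=1e-5)

Note that the fused layer halves the number of matrix multiplications but may hold more parameters than the original pair (here 16×8 versus 16×32 + 32×8 is smaller, but the opposite holds when the hidden width is below the input/output widths), which is one reason depth reduction does not automatically translate into speedups, as the abstract warns.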
Document Type: |
article |
File Description: |
electronic resource |
Language: |
English |
ISSN: |
2352-7110 |
Relation: |
http://www.sciencedirect.com/science/article/pii/S2352711024004011; https://doaj.org/toc/2352-7110 |
DOI: |
10.1016/j.softx.2024.102030 |
Access URL: |
https://doaj.org/article/2687fa41edf743578b8203a317c1c546 |
Accession Number: |
edsdoj.2687fa41edf743578b8203a317c1c546 |
Database: |
Directory of Open Access Journals |