Improving Deep Assertion Generation via Fine-Tuning Retrieval-Augmented Pre-trained Language Models

Bibliographic Details
Title: Improving Deep Assertion Generation via Fine-Tuning Retrieval-Augmented Pre-trained Language Models
Authors: Zhang, Quanjun, Fang, Chunrong, Zheng, Yi, Zhang, Yaxin, Zhao, Yuan, Huang, Rubing, Zhou, Jianyi, Yang, Yun, Zheng, Tao, Chen, Zhenyu
Publication Year: 2025
Collection: Computer Science
Subject Terms: Computer Science - Software Engineering
Abstract: Unit testing validates the correctness of the units of the software system under test and serves as the cornerstone in improving software quality and reliability. To reduce manual effort in writing unit tests, some techniques have been proposed to automatically generate test assertions, with recent integration-based approaches considered state-of-the-art. Despite being promising, such integration-based approaches face several limitations, including reliance on lexical matching for assertion retrieval and a limited training corpus for assertion generation. This paper proposes a novel retrieval-augmented deep assertion generation approach, namely RetriGen, based on a hybrid retriever and a pre-trained language model (PLM)-based generator. Given a focal-test, RetriGen first builds a hybrid assertion retriever to search for the most relevant Test-Assert Pair from external codebases. The retrieval process considers lexical similarity and semantic similarity via a token-based and an embedding-based retriever, respectively. RetriGen then treats assertion generation as a sequence-to-sequence task and designs a PLM-based assertion generator to predict a correct assertion. We conduct extensive experiments to evaluate RetriGen against six state-of-the-art approaches across two large-scale datasets and two metrics. The results demonstrate that RetriGen achieves 57.66% accuracy and 73.24% CodeBLEU, outperforming all baselines with average improvements of 50.66% and 14.14%, respectively.
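The hybrid retrieval described in the abstract can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's implementation: the function names, the Jaccard lexical score, the hashing "embedding" (a stand-in for a learned encoder), and the weighting parameter `alpha` are all illustrative.

```python
import math

# Hypothetical sketch of hybrid (lexical + semantic) retrieval over
# Test-Assert Pairs; all names and weights are illustrative assumptions.

def token_score(query: str, candidate: str) -> float:
    """Lexical similarity: Jaccard overlap of whitespace token sets."""
    q, c = set(query.split()), set(candidate.split())
    return len(q & c) / len(q | c) if q | c else 0.0

def embed(text: str, dim: int = 64) -> list:
    """Toy bag-of-words hashing embedding; a real system would use a
    trained encoder to capture semantic similarity."""
    vec = [0.0] * dim
    for tok in text.split():
        vec[hash(tok) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def embedding_score(query: str, candidate: str) -> float:
    """Semantic similarity: cosine between (unit-norm) embedding vectors."""
    return sum(a * b for a, b in zip(embed(query), embed(candidate)))

def hybrid_retrieve(focal_test: str, corpus: list, alpha: float = 0.5) -> str:
    """Return the candidate with the best weighted lexical + semantic score."""
    def score(tap: str) -> float:
        return (alpha * token_score(focal_test, tap)
                + (1 - alpha) * embedding_score(focal_test, tap))
    return max(corpus, key=score)
```

In this sketch, `alpha` balances the token-based and embedding-based retrievers; retrieving the top-scoring Test-Assert Pair from an external codebase mirrors the first stage of the approach before the PLM-based generator produces the final assertion.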
Comment: Accepted to ACM Transactions on Software Engineering and Methodology (TOSEM 2025)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2502.16071
Accession Number: edsarx.2502.16071
Database: arXiv