Attention to the strengths of physical interactions: Transformer and graph-based event classification for particle physics experiments

Bibliographic Details
Title: Attention to the strengths of physical interactions: Transformer and graph-based event classification for particle physics experiments
Authors: Builtjes, Luc, Caron, Sascha, Moskvitina, Polina, Nellist, Clara, Ruiz de Austri, Roberto, Verheyen, Rob, Zhang, Zhongyi
Publication Year: 2022
Collection: High Energy Physics - Experiment; High Energy Physics - Phenomenology
Subject Terms: High Energy Physics - Phenomenology, High Energy Physics - Experiment
More Details: A major task in particle physics is the measurement of rare signal processes. Even modest improvements in background rejection, at a fixed signal efficiency, can significantly enhance the measurement sensitivity. Building on prior research by others that incorporated physical symmetries into neural networks, this work extends those ideas to include additional physics-motivated features. Specifically, we introduce energy-dependent particle interaction strengths, derived from leading-order Standard Model (SM) predictions, into modern deep learning architectures, including transformer architectures (Particle Transformer) and graph neural networks (ParticleNet). These interaction strengths, represented as the SM interaction matrix, are incorporated into the attention matrix (transformers) and the edges (graphs). Our results in event classification show that the integration of all physics-motivated features improves background rejection by $10\%-40\%$ over baseline models, with an additional gain of approximately $10\%$ (absolute) due to the SM interaction matrix. This study also provides one of the broadest comparisons of event classifiers to date, demonstrating how various architectures perform on this task. A simplified statistical analysis demonstrates that these enhanced architectures yield significant improvements in signal significance compared to a graph network baseline.
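The abstract describes injecting a pairwise SM interaction matrix into the transformer's attention. A minimal NumPy sketch of that general mechanism is below: a pairwise bias matrix U is added to the scaled dot-product attention logits before the softmax, so pairs of particles with strong physical interactions receive larger attention weights. The function name and the encoding of U are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def attention_with_interaction_bias(Q, K, V, U):
    """Scaled dot-product attention with a pairwise bias U added to the
    attention logits before the softmax.

    Q, K, V : (n, d) arrays of per-particle queries, keys, values.
    U       : (n, n) pairwise bias, e.g. an encoding of SM interaction
              strengths (hypothetical encoding -- for illustration only).
    """
    d = Q.shape[-1]
    # Physics-motivated bias enters the logits additively.
    logits = Q @ K.T / np.sqrt(d) + U
    # Numerically stable row-wise softmax.
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

A large entry U[i, j] steers particle i's attention toward particle j; with U = 0 the function reduces to plain scaled dot-product attention.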
Comment: Version 3 of this paper supersedes arXiv:2211.05143v2. 36 pages, 6 figures
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2211.05143
Accession Number: edsarx.2211.05143
Database: arXiv