Nearest-Neighbours Neural Network architecture for efficient sampling of statistical physics models
Title: | Nearest-Neighbours Neural Network architecture for efficient sampling of statistical physics models |
---|---|
Authors: | Del Bono, Luca Maria; Ricci-Tersenghi, Federico; Zamponi, Francesco |
Publication Year: | 2024 |
Collection: | Condensed Matter Physics (Other) |
Subject Terms: | Condensed Matter - Disordered Systems and Neural Networks; Condensed Matter - Statistical Mechanics; Physics - Computational Physics |
Abstract: | The task of efficiently sampling the Gibbs-Boltzmann distribution of disordered systems is important both for the theoretical understanding of these models and for the solution of practical optimization problems. Unfortunately, this task is known to be hard, especially for spin glasses at low temperatures. Recently, many attempts have been made to tackle the problem by combining classical Monte Carlo schemes with newly devised Neural Networks that learn to propose smart moves. In this article we introduce the Nearest-Neighbours Neural Network (4N) architecture, a physically interpretable deep architecture whose number of parameters scales linearly with the size of the system and which can be applied to a large variety of topologies. We show that the 4N architecture can accurately learn the Gibbs-Boltzmann distribution of the two-dimensional Edwards-Anderson model, and specifically of some of its hardest instances. In particular, it captures properties such as the energy, the correlation function and the overlap probability distribution. Finally, we show that the 4N performance increases with the number of layers, in a way that clearly connects to the correlation length of the system, thus providing a simple and interpretable criterion for choosing the optimal depth. |
Comment: | 10 pages, 6 figures; SI: 3 pages |
Document Type: | Working Paper |
Access URL: | http://arxiv.org/abs/2407.19483 |
Accession Number: | edsarx.2407.19483 |
Database: | arXiv |
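The sampling task the abstract refers to can be made concrete with a minimal sketch: plain single-spin-flip Metropolis sampling of the two-dimensional Edwards-Anderson model with quenched ±J couplings and periodic boundaries. This is only the classical baseline that neural-network proposal schemes aim to improve upon, not the paper's 4N architecture; all variable names and parameter choices below (lattice size `L`, inverse temperature `beta`, number of sweeps) are illustrative assumptions.

```python
import numpy as np

def ea_energy(spins, Jh, Jv):
    """Energy H = -sum_<ij> J_ij s_i s_j of a 2D Edwards-Anderson
    configuration with periodic boundaries; Jh couples each site to its
    right neighbour, Jv to the one below, so each bond is counted once."""
    right = np.roll(spins, -1, axis=1)
    down = np.roll(spins, -1, axis=0)
    return -np.sum(Jh * spins * right) - np.sum(Jv * spins * down)

def metropolis_sweep(spins, Jh, Jv, beta, rng):
    """One Metropolis sweep: L*L single-spin-flip proposals at random sites."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Local field from the four nearest neighbours (periodic boundaries).
        h = (Jh[i, j] * spins[i, (j + 1) % L]
             + Jh[i, (j - 1) % L] * spins[i, (j - 1) % L]
             + Jv[i, j] * spins[(i + 1) % L, j]
             + Jv[(i - 1) % L, j] * spins[(i - 1) % L, j])
        dE = 2.0 * spins[i, j] * h  # energy change if spin (i, j) is flipped
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

rng = np.random.default_rng(0)
L, beta = 16, 0.5                              # illustrative size and temperature
Jh = rng.choice([-1.0, 1.0], size=(L, L))      # quenched +-J disorder, horizontal bonds
Jv = rng.choice([-1.0, 1.0], size=(L, L))      # quenched +-J disorder, vertical bonds
spins = rng.choice([-1, 1], size=(L, L))       # random initial configuration
for _ in range(200):
    metropolis_sweep(spins, Jh, Jv, beta, rng)
e = ea_energy(spins, Jh, Jv) / (L * L)         # energy per spin
```

At low temperature such local dynamics get trapped in metastable states of the spin glass, which is precisely why the paper couples Monte Carlo with learned smart-move proposals.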