Speaker
Description
A recent proposal suggests using autoregressive neural networks to approximate the multi-dimensional probability distributions found in lattice field theories and statistical mechanics. Unlike standard Monte Carlo algorithms, these networks can serve as variational approximators that give direct access to extensive properties of statistical systems, such as the free energy.
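The variational idea can be illustrated with a minimal sketch: for a tiny Ising system the exact free energy $F = -T \ln Z$ is computable by brute-force enumeration, and any normalized ansatz $q$ yields an upper bound $F_q = \mathbb{E}_q[E(s) + T \ln q(s)] \ge F$. The toy setup below uses a uniform factorized $q$ as a stand-in for a trained autoregressive network; it is an illustrative assumption, not the poster's implementation.

```python
import itertools
import math

def energy(spins, L, J=1.0):
    """2D Ising energy with periodic boundaries (each bond counted twice here)."""
    E = 0.0
    for i in range(L):
        for j in range(L):
            s = spins[i * L + j]
            E -= J * s * spins[((i + 1) % L) * L + j]  # vertical neighbour
            E -= J * s * spins[i * L + (j + 1) % L]    # horizontal neighbour
    return E

L, T = 2, 2.0
configs = list(itertools.product([-1, 1], repeat=L * L))

# Exact free energy F = -T ln Z by enumeration (feasible only for tiny L).
Z = sum(math.exp(-energy(s, L) / T) for s in configs)
F_exact = -T * math.log(Z)

# Variational free energy F_q = E_q[E(s) + T ln q(s)].  Here q is the simplest
# normalized ansatz (uniform, i.e. independent spins), standing in for a trained
# autoregressive model.  By the Gibbs inequality, F_q >= F_exact.
q = 1.0 / len(configs)
F_var = sum(q * (energy(s, L) + T * math.log(q)) for s in configs)
assert F_var >= F_exact  # the variational bound holds
```

A trained autoregressive network would replace the uniform $q$ with learned conditionals $q(s_k \mid s_1, \ldots, s_{k-1})$, tightening the bound toward the exact free energy.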
In the case of two-dimensional systems, the numerical cost of such simulations scales like $L^6$ with the linear size $L$ of an $L \times L$ system, and can be reduced to $L^3$ by using a hierarchy of autoregressive neural networks.
In this poster, we show the generalization of the two-dimensional hierarchical algorithm to the three-dimensional Ising model on an $L \times L \times L$ lattice, whose cost scales as $L^6$ instead of the naively expected $L^9$. We present simulations conducted on lattices of various sizes, up to $16 \times 16 \times 16$ spins. We also discuss several algorithms that allow us to train our networks faster.
Our proposed approach improves neural network training, yielding a closer approximation of the target probability distribution. This leads to a more accurate variational free energy, a reduced autocorrelation time in Markov chain Monte Carlo simulations, and lower memory requirements thanks to the hierarchical network structure.