Speaker
Mathis Gerdes
(University of Amsterdam)
Description
Generative models, particularly normalizing flows, have recently been proposed as a way to speed up sample generation in lattice field theory. We have explored the role that symmetry considerations and machine-learning concepts such as transfer learning can play by applying novel continuous normalizing flows to a scalar field theory. Beyond that, interesting connections between renormalization group theory and generative models, pointed out in recent papers, deserve further exploration.
Primary author
Mathis Gerdes
(University of Amsterdam)
Co-authors
Pim de Haan
(UvA, Qualcomm)
Corrado Rainone
(Qualcomm)
Roberto Bondesan
(Qualcomm)
Miranda Cheng
(UvA, Academia Sinica)