4.3 Physics-informed AI, Foundation models and related techniques

Conveners
- Johan Messchendorp (GSI Helmholtzzentrum für Schwerionenforschung GmbH (GSI))
Foundation models are increasingly prominent in various physics subfields. At the same time, the application of supervised machine learning methods in astronomy suffers from scarce training data. We explore computer vision foundation models, focusing on their application to radio astronomical image data.
Specifically, we explore the unsupervised, morphological classification of radio sources through...
In this work we demonstrate that significant gains in performance and data efficiency can be achieved by moving beyond the standard paradigm of sequential optimization in High Energy Physics (HEP). We conceptually connect HEP reconstruction and analysis to modern machine learning workflows such as pretraining, fine-tuning, domain adaptation, and high-dimensional embedding spaces, and quantify the...
In any lattice-QCD-based study, gauge configurations must first be generated using some form of Monte Carlo simulation. These are then used to compute physical observables. In these measurements, physical observables (like the chiral condensate or the baryon number density) can be expressed as a trace of a combination of products of the inverse fermion matrix. These traces are usually estimated...
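The abstract breaks off before describing the estimator, but the standard technique such traces point to is stochastic (Hutchinson-style) estimation with Z2 noise vectors: for noise vectors η with ±1 entries, η†A⁻¹η is an unbiased estimator of tr(A⁻¹). A minimal sketch, with a dense NumPy solve standing in for the fermion-matrix inversion (the matrix and sample count here are illustrative, not from the talk):

```python
import numpy as np

def hutchinson_trace_inverse(A, n_samples=200, rng=None):
    """Stochastically estimate tr(A^{-1}) with Z2 noise vectors.

    For each noise vector eta with entries +/-1, eta^T A^{-1} eta is an
    unbiased estimator of tr(A^{-1}); averaging over n_samples draws
    reduces the statistical error like 1/sqrt(n_samples).
    """
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    est = 0.0
    for _ in range(n_samples):
        eta = rng.choice([-1.0, 1.0], size=n)
        x = np.linalg.solve(A, eta)  # stand-in for a fermion-matrix solve
        est += eta @ x
    return est / n_samples

# Compare against the exact trace on a small, well-conditioned SPD matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)
exact = np.trace(np.linalg.inv(A))
approx = hutchinson_trace_inverse(A, n_samples=500, rng=1)
```

In an actual lattice computation the explicit solve would be replaced by an iterative Krylov solver applied to the Dirac operator, since the fermion matrix is far too large to form or invert densely.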
The BERT pretraining paradigm has proven to be highly effective in many domains, including natural language processing, image processing and biology. To apply the BERT paradigm, the data needs to be described as a set of tokens, and each token needs to be labelled. To date, the BERT paradigm has not been explored in the context of HEP. The samples that form the data used in HEP can be described...
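As a concrete illustration of what "labelling each token" means in the BERT paradigm: a random subset of token positions is replaced by a mask token, and the original ids at those positions become the prediction targets. A minimal sketch of that data-preparation step (the `MASK_ID` value and the `-100` ignore index are illustrative conventions, not anything specified in the abstract):

```python
import numpy as np

MASK_ID = 0    # hypothetical id reserved for the [MASK] token
IGNORE = -100  # label value at positions the loss should skip

def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """Prepare one masked-prediction training pair, BERT-style.

    A random ~mask_prob fraction of positions is replaced by MASK_ID in
    the inputs; the labels keep the original ids only at those positions,
    so the model is trained to reconstruct exactly what was hidden.
    """
    rng = np.random.default_rng(rng)
    ids = np.asarray(token_ids)
    mask = rng.random(ids.shape) < mask_prob
    inputs = np.where(mask, MASK_ID, ids)
    labels = np.where(mask, ids, IGNORE)
    return inputs, labels
```

For HEP data the "tokens" would be set elements such as reconstructed particles or detector hits rather than words, but the masking-and-reconstruction objective carries over unchanged.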
We present a newly developed code, JERALD (JAX Enhanced Resolution Approximate Lagrangian Dynamics), that builds on the Lagrangian Deep Learning (LDL) method of Dai and Seljak (2021), improving on the time and memory requirements of the original code. JERALD takes as input dark-matter (DM) particle positions from a low-resolution, computationally inexpensive run of the approximate N-body simulator...
Traditionally, searches for new physics use complex computer simulations to reproduce what Standard Model processes should look like in collisions recorded by the LHC experiments. These are then compared to simulations of new-physics models (e.g. dark matter or supersymmetry).
The lack of evidence for new interactions and particles since the Higgs boson’s discovery has motivated the...
Foundation models are multi-dataset and multi-task machine learning methods that, once pre-trained, can be fine-tuned for a large variety of downstream applications. The successful development of such general-purpose models for physics data would be a major breakthrough, as they could improve the achievable physics performance while at the same time drastically reducing the required amount of...
A recent proposal suggests using autoregressive neural networks to approximate the multi-dimensional probability distributions found in lattice field theories or statistical mechanics. Unlike Monte Carlo algorithms, these networks can serve as variational approximators to evaluate extensive properties of statistical systems, such as the free energy.
In the case of two-dimensional systems, the...
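The property these variational approximators rely on is that an autoregressive factorization q(s) = Πᵢ q(sᵢ | s₁,…,sᵢ₋₁) is normalized by construction, so log q is available exactly and quantities like the variational free energy can be estimated from direct samples without a partition function. A toy sketch verifying this normalization for binary spins (the logistic conditionals and random weights are illustrative, not the networks of the proposal):

```python
import numpy as np
from itertools import product

def log_q(s, weights, bias):
    """Exact log-probability of a spin configuration s (entries 0/1)
    under a minimal autoregressive model: the conditional probability
    of s[i] is a logistic function of the already-fixed spins s[:i].
    """
    logp = 0.0
    for i in range(len(s)):
        h = bias[i] + weights[i, :i] @ s[:i]
        p1 = 1.0 / (1.0 + np.exp(-h))  # p(s_i = 1 | s_1..s_{i-1})
        logp += np.log(p1 if s[i] == 1 else 1.0 - p1)
    return logp

# Because each factor is a proper conditional distribution, summing q
# over all 2^n configurations gives exactly 1: no partition function
# is ever needed, for any choice of weights.
rng = np.random.default_rng(0)
n = 4
W = np.tril(rng.standard_normal((n, n)), k=-1)  # strictly causal couplings
b = rng.standard_normal(n)
total = sum(np.exp(log_q(np.array(s), W, b))
            for s in product([0, 1], repeat=n))
```

Given an energy function E(s), the variational free energy F_q = ⟨E(s) + T log q(s)⟩_q then follows from samples drawn sequentially, spin by spin, from the same conditionals.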