1.2 Generative models & Simulation of physical systems
Conveners
- Tobias Golling (University of Geneva)
We propose a differentiable vertex fitting algorithm for secondary vertex fitting that can be seamlessly integrated into neural networks for jet flavour tagging. Vertex fitting is formulated as an optimization problem in which gradients of the optimized vertex solution are defined through implicit differentiation and can be passed to upstream or downstream neural network...
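As a rough illustration of the idea (not the authors' implementation), the sketch below fits a vertex to straight-line tracks by least squares and obtains the gradient of the fitted vertex with respect to the track reference points through the implicit function theorem; the track model, dimensions, and names are assumptions.

```python
# Sketch, assuming straight-line tracks with reference points p_i and unit
# directions u_i. The vertex minimises chi^2 = sum_i |P_i (v - p_i)|^2 with
# P_i = I - u_i u_i^T. At the optimum, the implicit function theorem gives
#   dv*/dp_i = -(d^2chi2/dv^2)^{-1} d^2chi2/(dv dp_i) = A^{-1} P_i
# (the factors of 2 cancel), where A = sum_i P_i.
import numpy as np

def fit_vertex(points, dirs):
    """Closed-form least-squares vertex for straight-line tracks."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, u in zip(points, dirs):
        P = np.eye(3) - np.outer(u, u)   # projector orthogonal to the track
        A += P
        b += P @ p
    return np.linalg.solve(A, b), A      # A is half the chi^2 Hessian

def vertex_jacobian(dirs, A):
    """dv*/dp_i from the implicit function theorem (linear problem)."""
    return [np.linalg.solve(A, np.eye(3) - np.outer(u, u)) for u in dirs]

rng = np.random.default_rng(0)
true_v = np.array([0.1, -0.2, 1.5])
dirs = [d / np.linalg.norm(d) for d in rng.normal(size=(4, 3))]
points = [true_v + rng.normal(scale=0.01, size=3) for _ in dirs]

v_fit, A = fit_vertex(points, dirs)
jacs = vertex_jacobian(dirs, A)   # gradients that a network could consume
print(v_fit, jacs[0].shape)
```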
Detected gravitational waves are goldmines of information about the emitting compact-binary systems. MCMC techniques usually infer the parameter values in a 15-dimensional parameter space accurately, but they are very slow. Physics-Informed Neural Networks (PINNs), on the other hand, are a rapidly emerging branch of supervised machine learning devoted precisely to solving physical...
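For readers unfamiliar with PINNs, a minimal, generic training loop for a toy damped-oscillator ODE is sketched below; this is not the gravitational-wave setup, and the network size, equation, and coefficients are placeholders.

```python
# PINN sketch (toy assumption): a small network u_theta(t) is trained so that
# the residual of u'' + 2*beta*u' + omega^2 * u = 0 vanishes on random
# collocation points, plus initial-condition terms u(0)=1, u'(0)=0.
import torch

torch.manual_seed(0)
beta, omega = 0.1, 2.0
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    t = 10.0 * torch.rand(128, 1)           # collocation points in [0, 10]
    t.requires_grad_(True)
    u = net(t)
    du = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, t, torch.ones_like(du), create_graph=True)[0]
    residual = d2u + 2 * beta * du + omega**2 * u        # physics loss term
    t0 = torch.zeros(1, 1, requires_grad=True)
    u0 = net(t0)
    du0 = torch.autograd.grad(u0, t0, torch.ones_like(u0), create_graph=True)[0]
    loss = residual.pow(2).mean() + (u0 - 1.0).pow(2).mean() + du0.pow(2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```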
Generative models, particularly normalizing flows, have recently been proposed to speed up sample generation in lattice field theory. We have explored the role that symmetry considerations and ML concepts such as transfer learning can play by applying novel continuous normalizing flows to a scalar field theory. Beyond that, interesting connections exist between renormalization group theory and...
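The flow-based sampling idea can be summarised in a few lines. The sketch below is an assumption-laden toy rather than the continuous flows of the talk: a trivial per-site affine map of Gaussian noise stands in for the flow, trained to sample a two-dimensional lattice phi^4 theory by minimising the reverse KL divergence between the flow density and exp(-S).

```python
# Toy flow-based sampler for a lattice phi^4 theory (all choices assumed):
# minimise E_q[ log q(phi) + S(phi) ], i.e. the reverse KL up to log Z.
import math
import torch

L = 8                                   # lattice size (L x L), placeholder
m2, lam = -1.0, 1.0                     # toy phi^4 couplings

def action(phi):                        # phi: (batch, L, L)
    kin = sum(((phi - torch.roll(phi, 1, dims=d)) ** 2).sum(dim=(1, 2))
              for d in (1, 2))
    pot = (m2 * phi**2 + lam * phi**4).sum(dim=(1, 2))
    return 0.5 * kin + pot

log_s = torch.zeros(L, L, requires_grad=True)   # per-site log-scale
mu = torch.zeros(L, L, requires_grad=True)      # per-site shift
opt = torch.optim.Adam([log_s, mu], lr=1e-2)

for step in range(500):
    z = torch.randn(256, L, L)
    phi = z * log_s.exp() + mu                   # "flow": z -> phi
    # change of variables: log q(phi) = log N(z) - sum(log_s)
    log_q = ((-0.5 * z**2).sum(dim=(1, 2))
             - 0.5 * L * L * math.log(2 * math.pi) - log_s.sum())
    loss = (log_q + action(phi)).mean()          # reverse KL up to log Z
    opt.zero_grad(); loss.backward(); opt.step()
```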
Nested sampling has become an important tool for inference in astronomical data analysis. However, it is often computationally expensive to run. This poses a challenge for certain applications, such as gravitational-wave inference. To address this, we previously introduced nessai, a nested sampling algorithm that incorporates normalizing flows to accelerate gravitational-wave inference by up...
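For orientation, a usage sketch loosely following the toy example in the nessai documentation is given below; the exact class and argument names may differ between versions, and the two-dimensional Gaussian model is only a placeholder for a gravitational-wave likelihood.

```python
# Usage sketch assuming the documented nessai interface: a Model subclass
# defines names, bounds, log_prior and log_likelihood, and FlowSampler runs
# the flow-accelerated nested sampling.
import numpy as np
from nessai.flowsampler import FlowSampler
from nessai.model import Model

class ToyGaussian(Model):
    """Two-dimensional Gaussian likelihood with a uniform prior."""
    def __init__(self):
        self.names = ["x", "y"]
        self.bounds = {"x": [-10.0, 10.0], "y": [-10.0, 10.0]}

    def log_prior(self, x):
        # Uniform prior inside the bounds, -inf outside.
        log_p = np.log(self.in_bounds(x), dtype="float")
        for n in self.names:
            log_p -= np.log(self.bounds[n][1] - self.bounds[n][0])
        return log_p

    def log_likelihood(self, x):
        log_l = np.zeros(x.size)
        for n in self.names:
            log_l += -0.5 * x[n] ** 2 - 0.5 * np.log(2 * np.pi)
        return log_l

sampler = FlowSampler(ToyGaussian(), output="outdir", nlive=1000, seed=1)
sampler.run()
```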
This presentation will highlight the impactful role of machine learning (ML) in high energy nuclear physics, particularly in studying QCD matter under extreme conditions. It will focus on three key applications: analyzing heavy ion collisions, reconstructing the neutron star equation of state (EoS), and advancing lattice field theory studies.
In heavy ion collisions, ML techniques...
Recently, machine learning has become a popular tool in lattice field theory. Here I will report on some applications of (lattice) field theory methods to further understand ML, illustrated using the Restricted Boltzmann Machine and stochastic quantisation as simple examples.
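As a reference point for the RBM example, the sketch below implements a binary Restricted Boltzmann Machine trained with one step of contrastive divergence (CD-1) in NumPy; the data and layer sizes are placeholders, and the field-theory analysis itself is not reproduced.

```python
# Minimal binary RBM with CD-1 updates (all sizes and data are assumptions).
import numpy as np

rng = np.random.default_rng(1)
n_vis, n_hid, lr = 16, 8, 0.05
W = 0.01 * rng.normal(size=(n_vis, n_hid))
a = np.zeros(n_vis)          # visible biases
b = np.zeros(n_hid)          # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0):
    """One contrastive-divergence update from a batch of visible vectors."""
    global W, a, b
    ph0 = sigmoid(v0 @ W + b)                      # p(h | v0)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + a)                    # p(v | h0)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b)
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)  # positive - negative phase
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)

data = (rng.random((512, n_vis)) < 0.5).astype(float)   # placeholder data
for epoch in range(100):
    cd1_step(data)
```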
We propose a quantum version of a generative diffusion model. In this algorithm, the artificial neural networks are replaced with parameterized quantum circuits in order to directly generate quantum states. We present both a fully quantum and a latent quantum version of the algorithm, as well as a conditioned version of these models. The models' performance has been evaluated using...
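To make the building block concrete, the following sketch simulates a small parameterized quantum circuit as a NumPy statevector and trains it with the parameter-shift rule to prepare a target state; it only illustrates the kind of circuit that replaces the neural network, not the diffusion model itself, and the ansatz and target are assumptions.

```python
# Generic parameterized-quantum-circuit sketch (assumed ansatz, not the
# talk's model): two RY layers with a CNOT, trained to prepare a Bell state
# by maximising the fidelity with parameter-shift gradients.
import numpy as np

n_params = 4
target = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # 2-qubit Bell state

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def circuit(params):
    """RY layer, CNOT, RY layer acting on |00>."""
    psi = np.zeros(4); psi[0] = 1.0
    psi = np.kron(ry(params[0]), ry(params[1])) @ psi
    psi = CNOT @ psi
    psi = np.kron(ry(params[2]), ry(params[3])) @ psi
    return psi

def fidelity(params):
    return abs(target @ circuit(params)) ** 2

def parameter_shift_grad(params):
    """Exact gradient of the fidelity via the parameter-shift rule."""
    grad = np.zeros_like(params)
    for i in range(len(params)):
        plus, minus = params.copy(), params.copy()
        plus[i] += np.pi / 2
        minus[i] -= np.pi / 2
        grad[i] = 0.5 * (fidelity(plus) - fidelity(minus))
    return grad

params = 0.1 * np.random.default_rng(2).normal(size=n_params)
for step in range(200):
    params += 0.2 * parameter_shift_grad(params)    # gradient ascent
print(fidelity(params))
```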
Traditionally, machine-learning methods have mostly focused on making predictions without providing explicit probability distributions. The value of predicting probability distributions, however, lies in conveying the model's level of confidence and the range of potential outcomes. Unlike point estimates, which offer a single value, probability distributions offer a range of...
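One common way to obtain a distribution rather than a point estimate, shown here only as a generic illustration since the abstract above does not specify the method, is to let a network output a mean and a variance and train it with the Gaussian negative log-likelihood.

```python
# Hedged illustration (assumed toy data and architecture): a network outputs
# a mean and a log-variance per input and is trained with the Gaussian NLL,
# so each prediction comes with an uncertainty rather than a single value.
import torch

torch.manual_seed(0)
x = torch.linspace(-3, 3, 512).unsqueeze(1)
y = torch.sin(x) + 0.3 * torch.abs(x) * torch.randn_like(x)   # heteroscedastic toy

net = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 2),          # outputs: mean and log-variance
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    out = net(x)
    mean, log_var = out[:, :1], out[:, 1:]
    nll = 0.5 * (log_var + (y - mean) ** 2 / log_var.exp())   # Gaussian NLL
    loss = nll.mean()
    opt.zero_grad(); loss.backward(); opt.step()
```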
Off-shell effects in large LHC backgrounds are crucial for precision predictions and, at the same time, challenging to simulate. We show how a generative diffusion network learns off-shell kinematics given the much simpler on-shell process. It generates off-shell configurations fast and precisely, while reproducing even challenging on-shell features.
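A heavily reduced sketch of a conditional denoising-diffusion training step is given below; the feature dimensions, architecture, and noise schedule are placeholders, with the on-shell kinematics entering only as a conditioning vector, and it should not be read as the network from the talk.

```python
# Conditional DDPM-style training step (assumed dimensions and schedule):
# noise the off-shell features, then train the model to predict that noise
# given the noised sample, the on-shell condition, and the time step.
import torch

torch.manual_seed(0)
d_off, d_on = 8, 6                      # placeholder feature dimensions
T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alpha_bar = torch.cumprod(1.0 - betas, dim=0)

model = torch.nn.Sequential(            # predicts the added noise
    torch.nn.Linear(d_off + d_on + 1, 128), torch.nn.SiLU(),
    torch.nn.Linear(128, d_off),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(x_off, x_on):
    """One denoising step on a batch of paired (off-shell, on-shell) features."""
    t = torch.randint(0, T, (x_off.shape[0],))
    ab = alpha_bar[t].unsqueeze(1)
    eps = torch.randn_like(x_off)
    x_t = ab.sqrt() * x_off + (1 - ab).sqrt() * eps
    t_feat = (t.float() / T).unsqueeze(1)
    pred = model(torch.cat([x_t, x_on, t_feat], dim=1))
    loss = (pred - eps).pow(2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# placeholder batch standing in for paired on-/off-shell kinematics
train_step(torch.randn(256, d_off), torch.randn(256, d_on))
```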
New radio telescopes, such as the SKA, will revolutionise our understanding of the Universe. They can detect the faintest distant galaxies and provide high-resolution observations of nearby galaxies. This allows for detailed statistical studies and insights into the formation and evolution of galaxies across cosmic time. These telescopes also play a crucial role in unravelling the physical...