2.3 Simulation-based inference
Conveners
- Roberto Ruiz de Austri
The GeV gamma-ray sky, as observed by the Fermi Large Area Telescope (Fermi LAT), harbours a plethora of localised point-like sources. At high latitudes ($|b| > 30^{\circ}$), most of these sources are of extragalactic origin. The source-count distribution as a function of their flux, $\mathrm{d}N/\mathrm{d}S$, is a well-established quantity to summarise this population. We employ sequential...
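As a rough illustration of sequential simulation-based inference applied to a source-count problem, the sketch below runs two rounds of neural posterior estimation with the public `sbi` package; the toy power-law simulator and the parameter ranges are placeholder assumptions, not the authors' pipeline.

```python
# A rough sketch of sequential neural posterior estimation with the public
# `sbi` package. The power-law simulator and prior ranges are placeholders.
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

def simulator(theta):
    """Toy stand-in: Poisson counts from a power-law dN/dS ~ S^-gamma."""
    gamma, norm = theta
    fluxes = norm * torch.distributions.Pareto(torch.tensor(1.0), gamma).sample((64,))
    return torch.poisson(fluxes)

prior = BoxUniform(low=torch.tensor([1.5, 0.1]), high=torch.tensor([2.5, 2.0]))
x_obs = simulator(torch.tensor([1.9, 1.0]))  # pretend observation

inference = SNPE(prior=prior)

# round 1: simulate from the prior
theta = prior.sample((500,))
x = torch.stack([simulator(t) for t in theta])
density_estimator = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior(density_estimator)

# round 2: resimulate from the current posterior at x_obs (the "sequential" step)
proposal = posterior.set_default_x(x_obs)
theta2 = proposal.sample((500,))
x2 = torch.stack([simulator(t) for t in theta2])
density_estimator = inference.append_simulations(theta2, x2, proposal=proposal).train()
posterior = inference.build_posterior(density_estimator)

samples = posterior.sample((1000,), x=x_obs)  # posterior over (gamma, norm)
```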
Type Ia supernovae (SNae Ia) are instrumental in constraining cosmological parameters, particularly dark energy. State-of-the-art likelihood-based analyses scale poorly to future large datasets, are limited to simplified probabilistic descriptions of e.g. peculiar velocities, photometric redshift uncertainties, instrumental noise, and selection effects, and must explicitly sample a...
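The contrast with explicit latent-variable sampling can be made concrete with a toy forward model: each latent nuisance (peculiar velocity, photometric-redshift scatter, instrumental noise, selection) is simply drawn inside the simulator, so a simulation-based method marginalises over it implicitly. Everything below, including the low-redshift distance expansion and the magnitude cut, is a placeholder sketch rather than the analysis described in the talk.

```python
# Toy forward model for one SN Ia: each latent nuisance is drawn internally,
# so an SBI method marginalises it implicitly. All numbers are placeholders.
import numpy as np

C_KM_S = 2.998e5  # speed of light [km/s]

def simulate_sn(omega_m, h, rng, m_lim=24.0):
    z_cos = rng.uniform(0.01, 0.6)                         # cosmological redshift
    v_pec = rng.normal(0.0, 300.0)                         # peculiar velocity [km/s]
    z_obs = z_cos + (1 + z_cos) * v_pec / C_KM_S
    z_phot = z_obs + rng.normal(0.0, 0.02 * (1 + z_obs))   # photo-z scatter
    # low-z luminosity distance with q0 = 1.5*omega_m - 1 (flat LCDM);
    # a real analysis would integrate the full expansion history
    d_l = C_KM_S * z_cos / (100.0 * h) * (1 + 0.5 * (2 - 1.5 * omega_m) * z_cos)
    m = -19.3 + 5 * np.log10(d_l) + 25 + rng.normal(0.0, 0.1)  # noisy magnitude
    if m > m_lim:                                          # selection effect
        return None                                        # SN not detected
    return np.array([z_phot, m])

rng = np.random.default_rng(0)
sims = [s for s in (simulate_sn(0.3, 0.7, rng) for _ in range(1000)) if s is not None]
```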
Forthcoming large-scale structure (LSS) Stage IV surveys will provide us with unprecedented data to probe the nature of dark matter and dark energy. However, analysing these data with conventional Markov Chain Monte Carlo (MCMC) methods will be challenging, due to the increase in the number of nuisance parameters and the presence of intractable likelihoods. In this talk, I discuss the first...
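One common route around an intractable likelihood, shown here only as a generic sketch on a toy 1D model, is neural ratio estimation: a classifier trained to distinguish joint pairs $(\theta, x)$ from shuffled pairs converges to the likelihood-to-evidence ratio, which can then stand in for the likelihood inside an MCMC. The architecture and simulator below are illustrative assumptions.

```python
# Sketch of neural ratio estimation (NRE): a classifier separating joint
# (theta, x) pairs from shuffled pairs learns the likelihood-to-evidence
# ratio. Toy simulator and architecture are illustrative assumptions.
import torch
import torch.nn as nn

def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)  # toy: x = theta + noise

theta = torch.rand(4096, 1) * 2 - 1               # prior U(-1, 1)
x = simulator(theta)
theta_marg = theta[torch.randperm(len(theta))]    # break the pairing

net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for _ in range(500):
    logits_joint = net(torch.cat([theta, x], dim=1))
    logits_marg = net(torch.cat([theta_marg, x], dim=1))
    loss = bce(logits_joint, torch.ones_like(logits_joint)) + \
           bce(logits_marg, torch.zeros_like(logits_marg))
    opt.zero_grad(); loss.backward(); opt.step()

# net(theta, x) now approximates log r = log p(x|theta)/p(x), which can
# drive MCMC or grid posteriors without an explicit likelihood.
```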
The detection of a stochastic gravitational wave background (SGWB) is, in some sense, one of the most subtle GW analysis challenges facing the community in the next-generation detector era. For example, at an experiment such as LISA, to extract the SGWB contributions we must simultaneously: detect and analyse thousands of highly overlapping sources, including massive black hole binary mergers...
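The structure of that joint problem can be caricatured in a few lines: frequency-domain data modelled as a deterministic resolved-source template plus Gaussian noise whose PSD combines instrument noise with an SGWB power law, fit simultaneously so that neither component absorbs power belonging to the other. The templates, parameter names, and numbers below are all placeholder assumptions.

```python
# Caricature of the joint fit: frequency-domain data = resolved-source
# template + Gaussian noise with PSD = instrument + SGWB power law.
import numpy as np

f = np.linspace(1e-4, 1e-1, 512)                 # LISA-like band [Hz]

def psd_model(f, s_inst, a_gwb, gamma):
    return s_inst + a_gwb * (f / 1e-3) ** gamma  # instrument + SGWB terms

def log_like(d, h, s_inst, a_gwb, gamma):
    """Whittle likelihood: residual power weighted by the total PSD."""
    s = psd_model(f, s_inst, a_gwb, gamma)
    return -np.sum(np.abs(d - h) ** 2 / s + np.log(s))

rng = np.random.default_rng(1)
h_true = 1e-2 * np.sin(2 * np.pi * f * 1e3)      # placeholder source template
s_true = psd_model(f, 1e-4, 5e-5, -0.7)
noise = rng.normal(0, np.sqrt(s_true / 2)) + 1j * rng.normal(0, np.sqrt(s_true / 2))
d = h_true + noise

# a global fit varies the source template h(theta) and the SGWB parameters
# (a_gwb, gamma) together, since each can absorb power from the other
print(log_like(d, h_true, 1e-4, 5e-5, -0.7))
```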
COSMOPOWER is a state-of-the-art Machine Learning framework adopted by all major Large-Scale Structure (LSS) and Cosmic Microwave Background (CMB) international collaborations to accelerate their cosmological inference pipelines. It achieves orders-of-magnitude acceleration by replacing the expensive computation of cosmological power spectra, traditionally performed with a Boltzmann...
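The core idea can be sketched generically (this is not COSMOPOWER's actual architecture or API): train a small network on spectra precomputed with a Boltzmann code, then evaluate it inside the inference loop at negligible cost.

```python
# Generic emulator sketch: a network mapping cosmological parameters to
# (log) power spectra precomputed with a Boltzmann code such as CAMB or
# CLASS. Dimensions, architecture, and data below are placeholders.
import torch
import torch.nn as nn

n_params, n_ell = 5, 2500   # e.g. (omega_b, omega_c, h, n_s, A_s) -> C_ell

emulator = nn.Sequential(
    nn.Linear(n_params, 512), nn.Tanh(),
    nn.Linear(512, 512), nn.Tanh(),
    nn.Linear(512, n_ell),
)
opt = torch.optim.Adam(emulator.parameters(), lr=1e-3)

params = torch.rand(10000, n_params)      # training inputs
log_spectra = torch.randn(10000, n_ell)   # placeholder for real spectra

for _ in range(100):
    loss = ((emulator(params) - log_spectra) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# at inference time, log C_ell = emulator(theta): orders of magnitude
# faster than calling the Boltzmann solver directly
```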
With new astronomical surveys, we are entering a data-driven era in cosmology. Modern machine learning methods are up to the task of optimally learning the Universe from low to high redshift. In 3D, tomography of the large-scale structure (LSS) via the 21cm line of hydrogen, targeted by the SKA (Square Kilometre Array), can both teach us about the properties of sources and of the gaseous media between them, while...
This talk presents a novel approach to dark matter direct detection in the next-generation DARWIN experiment, using anomaly-aware machine learning techniques. I will introduce a semi-unsupervised deep learning pipeline that falls under the umbrella of generalized Simulation-Based Inference (SBI), an approach that allows one to effectively learn likelihoods straight...
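As a stand-in for such a learned likelihood, the sketch below fits a minimal RealNVP-style normalizing flow to simulated background events, making $\log p(x)$ directly available; events assigned low likelihood under every known component are then anomaly candidates. The flow, the data, and the training details are illustrative assumptions, not the DARWIN pipeline.

```python
# Minimal RealNVP-style flow as a stand-in for a learned likelihood:
# fit log p(x) on simulated background, flag low-likelihood events.
import torch
import torch.nn as nn

class Coupling(nn.Module):
    def __init__(self, dim=2, flip=False):
        super().__init__()
        self.flip = flip
        self.net = nn.Sequential(nn.Linear(dim // 2, 64), nn.ReLU(),
                                 nn.Linear(64, dim))  # outputs (scale, shift)

    def forward(self, x):
        x0, x1 = x.chunk(2, dim=1)
        if self.flip:
            x0, x1 = x1, x0
        s, t = self.net(x0).chunk(2, dim=1)
        s = torch.tanh(s)                      # keep scales well-behaved
        y1 = x1 * torch.exp(s) + t
        y = torch.cat([x0, y1] if not self.flip else [y1, x0], dim=1)
        return y, s.sum(dim=1)                 # log|det J| = sum of log-scales

layers = nn.ModuleList([Coupling(flip=bool(i % 2)) for i in range(4)])
base = torch.distributions.MultivariateNormal(torch.zeros(2), torch.eye(2))
opt = torch.optim.Adam(layers.parameters(), lr=1e-3)

x_bkg = torch.randn(4096, 2) * torch.tensor([1.0, 0.3])  # stand-in simulations

for _ in range(300):
    z, logdet = x_bkg, torch.zeros(len(x_bkg))
    for layer in layers:
        z, ld = layer(z)
        logdet = logdet + ld
    loss = -(base.log_prob(z) + logdet).mean()  # maximum likelihood
    opt.zero_grad(); loss.backward(); opt.step()

# after training, log p(x) = base.log_prob(z) + logdet for each event;
# events with low log p(x) under all known components flag anomalies
```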
PolyChord was originally advertised with an encouragement for users to experiment with their own clustering algorithms. Identifying clusters of nested sampling live points is critical for PolyChord to perform nested sampling correctly. We have updated the Python interface of PolyChordLite to allow straightforward substitution of different clustering methods. Recent reconstructions of the...
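A sketch of what such a substitution might look like, on a bimodal toy likelihood with scikit-learn's KMeans standing in for the user's clustering method. The keyword name (`cluster`) and the callable's signature (live-point positions in, integer labels out) are assumptions based on the abstract, not a documented API.

```python
# Hypothetical use of the updated interface: a user-supplied clustering
# function passed to pypolychord. The `cluster` keyword and its signature
# are assumptions from the abstract, not a documented API.
import numpy as np
from sklearn.cluster import KMeans
import pypolychord

def kmeans_cluster(points):
    """Label each live point; k is fixed here only for brevity."""
    return KMeans(n_clusters=2, n_init=10).fit_predict(points)

def loglikelihood(theta):
    # two well-separated Gaussian modes: a standard clustering stress test
    logL = np.logaddexp(-0.5 * np.sum((theta - 2.0) ** 2),
                        -0.5 * np.sum((theta + 2.0) ** 2))
    return logL, []

def prior(cube):
    return 10.0 * cube - 5.0  # map unit hypercube to [-5, 5]^2

samples = pypolychord.run(loglikelihood, 2, prior=prior, cluster=kmeans_cluster)
```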
This study explores the inference of BSM models and their parameters from kinematic distributions of collider signals through an n-channel 1D-Convolutional Neural Network (n1D-CNN). Our approach enables simultaneous inference from distributions of any fixed number of observables. As our training data are computationally expensive simulations, we also introduce a novel data augmentation...
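A minimal network of the kind described might look as follows, with each input channel holding one binned kinematic distribution and two heads returning model-class logits and parameter estimates; all architecture details are illustrative assumptions.

```python
# Illustrative n-channel 1D CNN: one channel per binned kinematic
# distribution, heads for BSM model class and parameters. All sizes
# and layer choices are placeholder assumptions.
import torch
import torch.nn as nn

n_channels, n_bins, n_classes, n_params = 3, 50, 4, 2

class N1DCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.class_head = nn.Linear(64, n_classes)   # which BSM model
        self.param_head = nn.Linear(64, n_params)    # its parameters

    def forward(self, x):                            # x: (batch, channels, bins)
        h = self.features(x).squeeze(-1)
        return self.class_head(h), self.param_head(h)

model = N1DCNN()
logits, params = model(torch.randn(8, n_channels, n_bins))
```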
Sensitivity forecasts inform the design of experiments and the direction of theoretical efforts. To arrive at representative results, Bayesian forecasts should marginalize their conclusions over uncertain parameters and noise realizations rather than picking fiducial values. However, this is typically computationally infeasible with current methods for forecasts of an experiment’s ability to...
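The point can be illustrated with a deliberately simple conjugate model in which each per-realization posterior is analytic: the forecast width is averaged over prior draws and noise realizations rather than computed once at a fiducial point. A real forecast would replace the analytic posterior with, e.g., a simulation-based one; all numbers below are placeholders.

```python
# Deliberately simple conjugate example: forecast width averaged over prior
# draws and Poisson noise realizations, versus a single fiducial dataset.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_real = 50, 2000
a0, b0 = 2.0, 0.5                              # Gamma prior on the rate lam

widths = []
for _ in range(n_real):
    lam = rng.gamma(a0, 1.0 / b0)              # draw the rate from its prior
    counts = rng.poisson(lam, n_obs)           # one noise realization
    a, b = a0 + counts.sum(), b0 + n_obs       # conjugate Gamma posterior
    widths.append(np.sqrt(a) / b)              # posterior std of lam

print("marginalised forecast width:", np.mean(widths))
print("fiducial-only width (lam = 4):", np.sqrt(a0 + 4.0 * n_obs) / (b0 + n_obs))
```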