30 April 2024 to 3 May 2024
Amsterdam, Hotel CASA
Europe/Amsterdam timezone

Transformer-inspired models for particle track reconstruction

30 Apr 2024, 17:19
3m
UvA 1, Hotel CASA

Flashtalk with Poster Session B 3.4 Foundation models and related techniques

Speaker

Yue Zhao (SURF, the Netherlands)

Description

Particle track reconstruction is a fundamental aspect of experimental analysis in high-energy particle physics. Conventional track-reconstruction algorithms scale poorly with detector occupancy, a pressing concern for the upcoming High-Luminosity phase of the Large Hadron Collider. This has motivated researchers to explore recent developments in deep learning for their scalability and potentially faster inference.

We assess the feasibility of three Transformer-inspired model architectures for hit clustering and classification. The first model uses an encoder-decoder architecture to reconstruct a track auto-regressively, given the coordinates of its first few hits. The second model employs an encoder-only architecture as a classifier, assigning each hit to one of a set of predefined track labels. The third model, also using an encoder-only configuration, regresses track parameters and subsequently assigns clusters in the track-parameter space to individual tracks.
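As a rough illustration of the second variant, the encoder-only classifier can be sketched as a standard Transformer encoder over a sequence of hit coordinates with a per-hit classification head. This is a minimal sketch, not the authors' implementation: the class name, feature dimension (3D hit coordinates), and number of track labels are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class HitClassifier(nn.Module):
    """Hypothetical encoder-only Transformer that assigns each hit
    to one of `num_tracks` predefined track labels."""

    def __init__(self, num_tracks=20, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # Embed raw (x, y, z) hit coordinates into the model dimension.
        self.embed = nn.Linear(3, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # Per-hit classification head over the predefined track labels.
        self.head = nn.Linear(d_model, num_tracks)

    def forward(self, hits):
        # hits: (batch, n_hits, 3) -> per-hit logits: (batch, n_hits, num_tracks)
        return self.head(self.encoder(self.embed(hits)))

model = HitClassifier()
logits = model(torch.randn(2, 100, 3))  # 2 events, 100 hits each
```

The encoder-decoder and regression variants differ mainly in the output head and decoding loop; the per-hit sequence view of the event is common to all three.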

We discuss preliminary studies on a simplified dataset, showing high success rates for all models under consideration, alongside our latest results on the TrackML dataset from the 2018 Kaggle challenge. Additionally, we report our experience adapting the models and training strategies, addressing the trade-offs among training efficiency, accuracy, and sequence length within the memory constraints of the hardware at our disposal.

Primary author

Yue Zhao (SURF, the Netherlands)
