Description
Our primary objective is to achieve a pioneering measurement of the challenging $gg\rightarrow ZH$ process in Large Hadron Collider (LHC) data and to extract new-physics contributions in the context of the Standard Model Effective Field Theory (SMEFT) framework. By leveraging the multi-head attention mechanism of Transformer encoders, we developed an innovative approach to efficiently capture long-range dependencies and contextual information in sequences of final-state objects from particle-collision events. This new technique enhances our ability to extract SMEFT parameters that are not well constrained by other measurements and deepens our understanding of fundamental interactions within the Higgs-boson sector. This presentation showcases the versatility of Transformer networks beyond their original domain and presents new opportunities for advanced data-driven physics research at the LHC.
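To illustrate the general idea, the following is a minimal sketch, not the analysis code, of a Transformer-encoder classifier over a padded sequence of final-state objects. It assumes each object carries a few kinematic features plus a categorical type; the class name `EventTransformer` and all hyperparameters are hypothetical choices for illustration only.

```python
# Hypothetical sketch: a Transformer encoder over per-event final-state objects.
import torch
import torch.nn as nn

class EventTransformer(nn.Module):
    def __init__(self, n_features=4, n_types=6, d_model=64, n_heads=4, n_layers=3):
        super().__init__()
        # Embed continuous kinematics and a categorical object type, then sum.
        self.kin_proj = nn.Linear(n_features, d_model)
        self.type_emb = nn.Embedding(n_types, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)  # signal-vs-background score

    def forward(self, kinematics, obj_type, padding_mask):
        # kinematics: (batch, n_objects, n_features); obj_type: (batch, n_objects)
        # padding_mask: (batch, n_objects), True where the slot is padding.
        x = self.kin_proj(kinematics) + self.type_emb(obj_type)
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        # Mean-pool over real (non-padded) objects before classification.
        mask = (~padding_mask).unsqueeze(-1).float()
        pooled = (x * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)
        return self.head(pooled).squeeze(-1)

# Toy usage: a batch of 2 events, each padded to 8 final-state objects.
model = EventTransformer()
kin = torch.randn(2, 8, 4)
typ = torch.randint(0, 6, (2, 8))
pad = torch.zeros(2, 8, dtype=torch.bool)
scores = model(kin, typ, pad)  # one logit per event
```

The self-attention layers let every object attend to every other object in the event, which is what allows long-range correlations among jets, leptons, and missing energy to inform the per-event discriminant.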