The measurement of the charge asymmetry for highly boosted top quark pairs decaying to a single lepton and jets is presented. The analysis is performed using 138 fb⁻¹ of data collected in pp collisions at √s = 13 TeV with the CMS detector during Run 2 of the Large Hadron Collider. The selection is optimized for top quark-antiquark pairs produced with large Lorentz boosts, resulting in non-isolated leptons and overlapping jets. The top quark charge asymmetry is measured for events with tt̄ invariant mass larger than 750 GeV and corrected for detector and acceptance effects using a binned maximum likelihood fit. The measured top quark charge asymmetry is in good agreement with the standard model prediction at next-to-next-to-leading order in perturbation theory with next-to-leading order electroweak corrections. Differential distributions for two invariant mass ranges are also presented.
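The charge asymmetry referred to above is conventionally built from the difference of absolute rapidities of the top quark and antiquark, Δ|y| = |y_t| − |y_t̄|. A minimal particle-level sketch of that counting definition is shown below; it is not the CMS analysis code, the toy rapidity arrays are invented, and the detector/acceptance correction step is omitted entirely:

```python
import numpy as np

def charge_asymmetry(y_top, y_antitop):
    """Counting definition of the top quark charge asymmetry:
    A_C = (N(dy > 0) - N(dy < 0)) / (N(dy > 0) + N(dy < 0)),
    with dy = |y_t| - |y_tbar| evaluated event by event."""
    delta_abs_y = np.abs(y_top) - np.abs(y_antitop)
    n_pos = np.count_nonzero(delta_abs_y > 0)
    n_neg = np.count_nonzero(delta_abs_y < 0)
    return (n_pos - n_neg) / (n_pos + n_neg)

# Purely illustrative toy rapidities (a symmetric toy gives A_C ~ 0)
rng = np.random.default_rng(0)
y_t = rng.normal(0.0, 1.2, size=100_000)
y_tbar = rng.normal(0.0, 1.2, size=100_000)
print(f"A_C = {charge_asymmetry(y_t, y_tbar):+.4f}")
```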
Parameter inference from event ensembles and the top-quark mass
Abstract One of the key tasks of any particle collider is measurement. In practice, this is often done by fitting data to a simulation, which depends on many parameters. Sometimes, when the effects of varying different parameters are highly correlated, a large ensemble of data may be needed to resolve parameter-space degeneracies. An important example is measuring the top-quark mass, where other physical and unphysical parameters in the simulation must be profiled when fitting the top-quark mass parameter. We compare four different methodologies for top-quark mass measurement: a classical histogram fit, similar to one commonly used in experiment, augmented by soft-drop jet grooming; a 2D profile likelihood fit with a nuisance parameter; a machine-learning method called DCTR; and a linear regression approach, either using a least-squares fit or a dense linearly activated neural network. Despite the fact that individual events are totally uncorrelated, we find that the linear regression methods work most effectively when we input an ensemble of events sorted by mass, rather than training them on individual events. Although all methods provide robust extraction of the top-quark mass parameter, the linear network does marginally best and is remarkably simple. For the top study, we conclude that the Monte-Carlo-based uncertainty on current extractions of the top-quark mass from LHC data can be reduced significantly (by perhaps a factor of 2) using networks trained on sorted event ensembles. More generally, machine learning from ensembles for parameter estimation has broad potential for collider physics measurements.
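The sorted-ensemble idea can be illustrated with a very small toy: generate batches of events at known parameter values, sort each batch's per-event observable, and regress the parameter from the sorted vector with a purely linear model. The sketch below uses an invented Gaussian stand-in for the reconstructed-mass observable rather than the paper's simulated events; all function names and numbers are illustrative, not the authors' setup:

```python
import numpy as np
from numpy.linalg import lstsq

def toy_ensemble(m_t, n_events, rng):
    # crude stand-in for a per-event reconstructed-mass observable
    return m_t + rng.normal(0.0, 15.0, n_events)

rng = np.random.default_rng(42)
n_ens, n_events = 2000, 256
masses = rng.uniform(170.0, 175.0, n_ens)   # true parameter for each ensemble
# Key step: sort each ensemble before feeding it to the linear model
X = np.stack([np.sort(toy_ensemble(m, n_events, rng)) for m in masses])
X = np.hstack([X, np.ones((n_ens, 1))])     # bias term

# Least-squares fit, i.e. a dense linearly activated network with one output
w, *_ = lstsq(X, masses, rcond=None)

# Evaluate on fresh ensembles
m_test = rng.uniform(170.0, 175.0, 500)
X_test = np.stack([np.sort(toy_ensemble(m, n_events, rng)) for m in m_test])
X_test = np.hstack([X_test, np.ones((500, 1))])
pred = X_test @ w
print(f"per-ensemble resolution ~ {np.std(pred - m_test):.2f} GeV (toy numbers)")
```

Sorting turns an unordered set of events into an (approximate) vector of order statistics, which is what lets a single linear map use the whole ensemble rather than one event at a time.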
- Award ID(s): 2019786
- PAR ID: 10299647
- Date Published:
- Journal Name: Journal of High Energy Physics
- Volume: 2021
- Issue: 9
- ISSN: 1029-8479
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract A measurement of four-top-quark production using proton-proton collision data at a centre-of-mass energy of 13 TeV collected by the ATLAS detector at the Large Hadron Collider, corresponding to an integrated luminosity of 139 fb$^{-1}$, is presented. Events are selected if they contain a single lepton (electron or muon) or an opposite-sign lepton pair, in association with multiple jets. The events are categorised according to the number of jets and how likely these are to contain b-hadrons. A multivariate technique is then used to discriminate between signal and background events. The measured four-top-quark production cross section is found to be $26^{+17}_{-15}$ fb, with a corresponding observed (expected) significance of 1.9 (1.0) standard deviations over the background-only hypothesis. The result is combined with the previous measurement performed by the ATLAS Collaboration in the multilepton final state. The combined four-top-quark production cross section is measured to be $24^{+7}_{-6}$ fb, with a corresponding observed (expected) signal significance of 4.7 (2.6) standard deviations over the background-only predictions. It is consistent within 2.0 standard deviations with the Standard Model expectation of $12.0 \pm 2.4$ fb.
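As a rough illustration of what "significance over the background-only hypothesis" means, the asymptotic approximation for a single counting experiment is $Z = \sqrt{2\,[(s+b)\ln(1+s/b) - s]}$. The sketch below is not the ATLAS statistical model (which uses a full profile-likelihood fit over many categories), and the counts are invented:

```python
import math

def asymptotic_significance(s, b):
    """Asymptotic discovery significance for a single counting experiment
    with expected signal s and expected background b."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# Purely illustrative numbers, not values from the ATLAS analysis
print(f"Z = {asymptotic_significance(s=12.0, b=30.0):.2f} sigma")
```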
-
The explosive growth in supercomputer capacity has changed simulation paradigms. Simulations have shifted from a few lengthy ones to an ensemble of multiple simulations with varying initial conditions or input parameters. Thus, an ensemble consists of large volumes of multi-dimensional data that could go beyond the exascale boundaries. However, the disparity in growth rates between storage capabilities and computing resources results in I/O bottlenecks. This makes it impractical to utilize conventional postprocessing and visualization tools for analyzing such massive simulation ensembles. In situ visualization approaches alleviate I/O constraints by saving predetermined visualizations in image databases during simulation. Nevertheless, the unavailability of output raw data restricts the flexibility of post hoc exploration of in situ approaches. Much research has been conducted to mitigate this limitation, but it falls short when it comes to simultaneously exploring and analyzing parameter and ensemble spaces. In this paper, we propose an expert-in-the-loop visual exploration analytic approach. The proposed approach leverages feature extraction, deep learning, and human expert–AI collaboration techniques to explore and analyze image-based ensembles. Our approach utilizes local features and deep learning techniques to learn the image features of ensemble members. The extracted features are then combined with simulation input parameters and fed to the visualization pipeline for in-depth exploration and analysis using human expert–AI interaction techniques. We show the effectiveness of our approach using several scientific simulation ensembles.
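A schematic sketch of the "extract image features, combine with input parameters, project for exploration" pipeline is given below. PCA is used only as a stand-in for the local-feature/deep-learning extractor described in the abstract, and the image and parameter arrays are random toy data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Toy stand-ins: 200 in situ images (64x64) from an ensemble, plus the
# 3 simulation input parameters that produced each ensemble member.
images = rng.random((200, 64, 64))
params = rng.random((200, 3))

# Stand-in feature extractor: PCA on flattened images (the paper uses
# local features and deep learning; PCA is only for illustration).
img_features = PCA(n_components=16).fit_transform(images.reshape(200, -1))

# Combine learned image features with the simulation input parameters,
# then project to 2D for an explorable overview of the ensemble.
combined = np.hstack([StandardScaler().fit_transform(img_features),
                      StandardScaler().fit_transform(params)])
embedding = PCA(n_components=2).fit_transform(combined)
print(embedding.shape)  # (200, 2) points to plot for ensemble exploration
```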
-
Abstract PELICAN is a novel permutation equivariant and Lorentz invariant or covariant aggregator network designed to overcome common limitations found in architectures applied to particle physics problems. Compared to many approaches that use non-specialized architectures that neglect underlying physics principles and require very large numbers of parameters, PELICAN employs a fundamentally symmetry group-based architecture that demonstrates benefits in terms of reduced complexity, increased interpretability, and raw performance. We present a comprehensive study of the PELICAN algorithm architecture in the context of both tagging (classification) and reconstructing (regression) Lorentz-boosted top quarks, including the difficult task of specifically identifying and measuring the W boson inside the dense environment of the Lorentz-boosted top-quark hadronic final state. We also extend the application of PELICAN to the tasks of identifying quark-initiated vs. gluon-initiated jets, and a multi-class identification across five separate target categories of jets. When tested on the standard task of Lorentz-boosted top-quark tagging, PELICAN outperforms existing competitors with much lower model complexity and high sample efficiency. On the less common and more complex task of 4-momentum regression, PELICAN also outperforms hand-crafted, non-machine learning algorithms. We discuss the implications of symmetry-restricted architectures for the wider field of machine learning for physics.
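One natural Lorentz-invariant, permutation-equivariant input for such a network is the matrix of pairwise Minkowski dot products of the particle four-momenta: it is unchanged by Lorentz transformations, and relabelling the particles just permutes its rows and columns. The sketch below only builds that input matrix and is not the PELICAN implementation; the toy jet constituents are invented:

```python
import numpy as np

def minkowski_dots(p):
    """Pairwise Minkowski inner products d_ij = E_i E_j - p_i . p_j for an
    (N, 4) array of four-momenta ordered as (E, px, py, pz)."""
    metric = np.diag([1.0, -1.0, -1.0, -1.0])
    return p @ metric @ p.T

# Toy jet with 5 massless constituents (purely illustrative numbers)
rng = np.random.default_rng(0)
mom = rng.normal(0.0, 20.0, (5, 3))
four_mom = np.hstack([np.linalg.norm(mom, axis=1, keepdims=True), mom])
d = minkowski_dots(four_mom)
print(np.allclose(np.diag(d), 0.0))  # massless constituents: p_i . p_i = 0
```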
-
Abstract The mass of the top quark is measured in 36.3 fb$^{-1}$ of LHC proton–proton collision data collected with the CMS detector at $\sqrt{s} = 13\,\text{TeV}$. The measurement uses a sample of top quark pair candidate events containing one isolated electron or muon and at least four jets in the final state. For each event, the mass is reconstructed from a kinematic fit of the decay products to a top quark pair hypothesis. A profile likelihood method is applied using up to four observables per event to extract the top quark mass. The top quark mass is measured to be $171.77 \pm 0.37\,\text{GeV}$. This approach significantly improves the precision over previous measurements.
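The profiling idea behind such an extraction can be shown with a toy: for each value of the parameter of interest (a top-mass hypothesis), the likelihood is maximised over a nuisance parameter (here an energy-scale factor with a 1% prior), and the profiled curve is scanned for its minimum. The Gaussian model and all numbers below are invented for illustration and are not the CMS likelihood:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)
true_mass, true_scale = 172.5, 1.01
data = true_scale * rng.normal(true_mass, 12.0, size=5000)  # toy reconstructed masses

def nll(mass, scale):
    """Toy negative log-likelihood: Gaussian model for the reconstructed mass,
    with an overall energy-scale nuisance and a 1% Gaussian constraint on it."""
    mu, sigma = scale * mass, scale * 12.0
    return (np.sum(0.5 * ((data - mu) / sigma) ** 2 + np.log(sigma))
            + 0.5 * ((scale - 1.0) / 0.01) ** 2)

def profiled_nll(mass):
    # profile out the nuisance: minimise the NLL over the scale at fixed mass
    return minimize_scalar(lambda s: nll(mass, s), bounds=(0.9, 1.1),
                           method="bounded").fun

scan = np.linspace(170.0, 175.0, 101)
values = np.array([profiled_nll(m) for m in scan])
print(f"profiled best-fit mass ~ {scan[np.argmin(values)]:.2f} GeV (toy)")
```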