
Title: Mitigating Large Background in Jet Substructure Observables
We discuss a new approach to jet physics analysis based on subtraction between cumulants of jet substructure observables. The subtracted cumulants are insensitive to soft-particle background uncorrelated with the hard process and allow comparisons between theoretical results and experimental measurements without the complication of soft backgrounds such as the underlying event and pile-up. We find that our method, applied to jet mass cumulants, efficiently eliminates the background in Monte Carlo simulations and in ATLAS jet mass measurements, and the results show good agreement with our analytic calculations performed using soft-collinear effective theory.
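The key property behind the subtraction is that cumulants are additive for statistically independent contributions, so a background added identically to two samples cancels in the difference of their cumulants. A minimal NumPy sketch of this additivity (the gamma/exponential toy shapes, sample sizes, and the two-sample setup are illustrative assumptions, not the actual jet mass spectra):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

def cumulants(x):
    # First three cumulants: mean, variance, third central moment
    k1 = x.mean()
    k2 = x.var()
    k3 = ((x - k1) ** 3).mean()
    return np.array([k1, k2, k3])

# Toy "hard" jet-mass-like observables for two samples (hypothetical shapes)
hard_a = rng.gamma(shape=3.0, scale=10.0, size=N)
hard_b = rng.gamma(shape=5.0, scale=10.0, size=N)

# Uncorrelated soft background drawn from the same distribution for both
def soft():
    return rng.exponential(scale=15.0, size=N)

obs_a = hard_a + soft()
obs_b = hard_b + soft()

# Cumulants are additive for independent contributions, so the background
# contribution drops out of the between-sample difference of cumulants.
delta_measured = cumulants(obs_a) - cumulants(obs_b)
delta_hard = cumulants(hard_a) - cumulants(hard_b)
resid = delta_measured - delta_hard
print(resid)  # residuals consistent with statistical fluctuations only
```

The same cancellation would fail for ordinary moments, which mix hard and soft contributions through cross terms; cumulants are the combinations for which independent sources add linearly.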
Authors:
Award ID(s):
1915093
Publication Date:
NSF-PAR ID:
10165379
Journal Name:
EPJ Web of Conferences
Volume:
235
Page Range or eLocation-ID:
05002
ISSN:
2100-014X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract One of the key tasks of any particle collider is measurement. In practice, this is often done by fitting data to a simulation, which depends on many parameters. Sometimes, when the effects of varying different parameters are highly correlated, a large ensemble of data may be needed to resolve parameter-space degeneracies. An important example is measuring the top-quark mass, where other physical and unphysical parameters in the simulation must be profiled when fitting the top-quark mass parameter. We compare four different methodologies for top-quark mass measurement: a classical histogram fit similar to one commonly used in experiment, augmented by soft-drop jet grooming; a 2D profile likelihood fit with a nuisance parameter; a machine-learning method called DCTR; and a linear regression approach, either using a least-squares fit or a dense linearly-activated neural network. Despite the fact that individual events are totally uncorrelated, we find that the linear regression methods work most effectively when we input an ensemble of events sorted by mass, rather than training them on individual events. Although all methods provide robust extraction of the top-quark mass parameter, the linear network does marginally best and is remarkably simple. For the top study, we conclude that the Monte-Carlo-based uncertainty on current extractions of the top-quark mass from LHC data can be reduced significantly (by perhaps a factor of 2) using networks trained on sorted event ensembles. More generally, machine learning from ensembles for parameter estimation has broad potential for collider physics measurements.
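The sorted-ensemble regression idea can be sketched in a few lines: sorting each ensemble turns a bag of events into a fixed-length vector of empirical quantiles, which a plain least-squares fit maps to the parameter. Everything below (the Gaussian toy events, the 170-176 GeV parameter range, the ensemble sizes) is an illustrative assumption, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(1)
E, n = 2000, 64  # number of ensembles, events per ensemble (toy sizes)

def make_ensembles(num, n):
    # Hypothetical "mass" parameter drawn per ensemble; each event is a
    # smeared observable around it. Sorting gives the empirical quantiles,
    # a fixed-length feature vector regardless of event ordering.
    m = rng.uniform(170.0, 176.0, size=num)
    events = rng.normal(m[:, None], 8.0, size=(num, n))
    return np.sort(events, axis=1), m

X_train, y_train = make_ensembles(E, n)
X_test, y_test = make_ensembles(500, n)

# Least-squares linear fit from sorted-event features to the parameter
A = np.column_stack([X_train, np.ones(len(X_train))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
pred = np.column_stack([X_test, np.ones(len(X_test))]) @ coef

rmse = np.sqrt(np.mean((pred - y_test) ** 2))
print(rmse)  # resolution in the toy units
```

The design point mirrors the abstract's finding: the regression acts on an ensemble-level summary (sorted events) rather than on individual events, and a linear model already extracts the parameter effectively.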
  2. Abstract The Lyα forest provides one of the best means of mapping large-scale structure at high redshift, including our tightest constraint on the distance-redshift relation before cosmic noon. We describe how the large-scale correlations in the Lyα forest can be understood as an expansion in cumulants of the optical depth field, which itself can be related to the density field by a bias expansion. This provides a direct connection between the observable and the statistics of the matter fluctuations which can be computed in a systematic manner. We discuss the way in which complex, small-scale physics enters the predictions, the origin of the much-discussed velocity bias and the 'renormalization' of the large-scale bias coefficients. Our calculations are within the context of perturbation theory, but we also make contact with earlier work using the peak-background split. Using the structure of the equations of motion we demonstrate, to all orders in perturbation theory, that the large-scale flux power spectrum becomes the linear spectrum times the square of a quadratic in the cosine of the angle to the line of sight. Unlike the case of galaxies, both the isotropic and anisotropic pieces receive contributions from small-scale physics.
  3. Abstract The ALICE Collaboration reports the first fully-corrected measurements of the N-subjettiness observable for track-based jets in heavy-ion collisions. This study is performed using data recorded in pp and Pb-Pb collisions at centre-of-mass energies of √s = 7 TeV and √s_NN = 2.76 TeV, respectively. In particular, the ratio of 2-subjettiness to 1-subjettiness, τ2/τ1, which is sensitive to the rate of two-pronged jet substructure, is presented. Energy loss of jets traversing the strongly interacting medium in heavy-ion collisions is expected to change the rate of two-pronged substructure relative to vacuum. The results are presented for jets with a resolution parameter of R = 0.4 and charged jet transverse momentum of 40 ≤ p_T,jet ≤ 60 GeV/c, which constitute a larger jet resolution and lower jet transverse momentum interval than previous measurements in heavy-ion collisions. This has been achieved by utilising a semi-inclusive hadron-jet coincidence technique to suppress the larger jet combinatorial background in this kinematic region. No significant modification of the τ2/τ1 observable for track-based jets in Pb-Pb collisions is observed relative to vacuum PYTHIA6 and PYTHIA8 references at the same collision energy. The measurements of τ2/τ1, together with the splitting aperture angle ΔR, are also performed in pp collisions at √s = 7 TeV for inclusive jets. These results are compared with PYTHIA calculations at √s = 7 TeV, in order to validate the model as a vacuum reference for the Pb-Pb centre-of-mass energy. The PYTHIA references for τ2/τ1 are shifted to larger values compared to the measurement in pp collisions. This hints at a reduction in the rate of two-pronged jets in Pb-Pb collisions compared to pp collisions.
  4. In this paper, we study the fragmentation of a heavy quark into a jet near threshold, meaning that the final-state jet carries most of the energy of the fragmenting heavy quark. Using the heavy quark fragmentation function, we simultaneously resum large logarithms of the jet radius R and of 1 − z, where z is the ratio of the jet energy to the initiating heavy quark energy. There are numerically significant corrections to the leading order rate due to this resummation. We also investigate heavy quark fragmentation to a groomed jet, using the soft drop grooming algorithm as an example. In order to do so, we introduce a collinear-ultrasoft mode sensitive to the grooming region determined by the algorithm's z_cut parameter. This allows us to resum large logarithms of z_cut/(1 − z), again leading to large numerical corrections near the endpoint. A nice feature of the analysis of the heavy quark fragmenting to a groomed jet is that the heavy quark mass m renders the algorithm infrared finite, allowing a perturbative calculation. We analyze this for E_J R ∼ m and E_J R ≫ m, where E_J is the jet energy. For the latter case, we introduce an ultracollinear-soft mode, allowing us to resum large logarithms of E_J R/m. Finally, as an application we calculate the rate for e+e− collisions to produce a heavy quark jet in the endpoint region, where we show that grooming effects have a sizable contribution near the endpoint.
  5. The practice of serial X-ray crystallography (SX) depends on efficient, continuous delivery of hydrated protein crystals while minimizing background scattering. Of the two major types of sample delivery devices, fixed-target devices offer several advantages over widely adopted jet injectors, including lower sample consumption, clog-free delivery, and the ability to control on-chip crystal density to improve hit rates. Here we present our development of versatile, inexpensive, and robust polymer microfluidic chips for routine and reliable room-temperature serial measurements at both synchrotrons and X-ray free electron lasers (XFELs). Our design includes highly X-ray-transparent enclosing thin-film layers tuned to minimize scatter background, adaptable sample flow layers tuned to match crystal size, and a large sample area compatible with both raster scanning and rotation-based serial data collection. The optically transparent chips can be used both for in situ protein crystallization (to eliminate crystal handling) and for crystal slurry loading, with prepared samples stable for weeks in a humidified environment and for several hours in ambient conditions. Serial oscillation crystallography, using a multi-crystal rotational data collection approach at a microfocus synchrotron beamline (SSRL, beamline 12-1), was used to benchmark the performance of the chips. High-resolution structures (1.3–2.7 Å) were collected from five different proteins – hen egg white lysozyme, thaumatin, bovine liver catalase, concanavalin-A (type VI), and SARS-CoV-2 nonstructural protein NSP5. Overall, our modular fabrication approach enables precise control over the cross-section of materials in the X-ray beam path and facilitates chip adaptation to different sample and beamline requirements for user-friendly, straightforward diffraction measurements at room temperature.