
Search: All records where Creators/Authors contains "Cai, H"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the publisher's embargo period.

  1. Social media platforms are playing increasingly critical roles in disaster response and rescue operations. During emergencies, users can post rescue requests along with their addresses on social media, while volunteers can search for those messages and send help. However, efficiently leveraging social media in rescue operations remains challenging because of the lack of tools to identify rescue request messages on social media automatically and rapidly. Analyzing social media data, such as Twitter data, relies heavily on Natural Language Processing (NLP) algorithms to extract information from texts. Bidirectional transformer models, such as the Bidirectional Encoder Representations from Transformers (BERT) model, have significantly outperformed previous NLP models in numerous text analysis tasks, providing new opportunities to precisely understand and classify social media data for diverse applications. This study developed and compared ten VictimFinder models for identifying rescue request tweets, three based on milestone NLP algorithms and seven BERT-based. A total of 3191 manually labeled disaster-related tweets posted during Hurricane Harvey in 2017 were used as the training and testing datasets. We evaluated the performance of each model by classification accuracy, computation cost, and model stability. Experimental results show that all BERT-based models significantly increased the accuracy of categorizing rescue-related tweets. The best model for identifying rescue request tweets is a customized BERT-based model with a Convolutional Neural Network (CNN) classifier. Its F1-score is 0.919, which outperforms the baseline model by 10.6%. The developed models can promote social media use for rescue operations in future disaster events.
    Free, publicly-accessible full text available July 1, 2023
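The abstract above ranks the VictimFinder models by F1-score. As a minimal, stdlib-only illustration of how that metric is computed for a binary rescue/non-rescue labeling task (not the authors' code; the toy labels below are invented for illustration):

```python
def f1_score(y_true, y_pred):
    """F1 = harmonic mean of precision and recall for the positive class (1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Toy example: 1 = rescue-request tweet, 0 = other tweet.
truth = [1, 1, 1, 0, 0, 1]
preds = [1, 1, 0, 0, 1, 1]
print(round(f1_score(truth, preds), 3))  # → 0.75
```

F1 is preferred over plain accuracy here because rescue-request tweets are a small minority class, so a classifier that predicts "other" for everything can score high accuracy while being useless.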
  2. Abstract The Phase-I trigger readout electronics upgrade of the ATLAS Liquid Argon calorimeters enhances the physics reach of the experiment during the upcoming operation at increasing Large Hadron Collider luminosities. The new system, installed during the second Large Hadron Collider Long Shutdown, increases the trigger readout granularity by up to a factor of ten as well as its precision and range. Consequently, the background rejection at trigger level is improved through enhanced filtering algorithms utilizing the additional information for topological discrimination of electromagnetic and hadronic shower shapes. This paper presents the final designs of the new electronic elements, their custom electronic devices, the procedures used to validate their proper functioning, and the performance achieved during the commissioning of this system.
    Free, publicly-accessible full text available May 1, 2023
  3. Abstract Quantum chromodynamics, the theory of the strong force, describes interactions of coloured quarks and gluons and the formation of hadronic matter. Conventional hadronic matter consists of baryons and mesons made of three quarks and quark-antiquark pairs, respectively. Particles with an alternative quark content are known as exotic states. Here a study is reported of an exotic narrow state in the D⁰D⁰π⁺ mass spectrum just below the D*⁺D⁰ mass threshold, produced in proton-proton collisions collected with the LHCb detector at the Large Hadron Collider. The state is consistent with the ground isoscalar T_cc⁺ tetraquark with a quark content of ccūd̄ and spin-parity quantum numbers J^P = 1⁺. Study of the DD mass spectra disfavours interpretation of the resonance as the isovector state. The decay structure via intermediate off-shell D*⁺ mesons is consistent with the observed D⁰π⁺ mass distribution. To analyse the mass of the resonance and its coupling to the D*D system, a dedicated model is developed under the assumption of an isoscalar axial-vector T_cc⁺ state decaying to the D*D channel. Using this model, resonance parameters including the pole position, scattering length, effective range and compositeness are determined to reveal important information about the nature of the T_cc⁺ state. In addition, an unexpected dependence of the production rate on track multiplicity is observed.
    Free, publicly-accessible full text available December 1, 2023
  4. Abstract The accurate simulation of additional interactions at the ATLAS experiment for the analysis of proton–proton collisions delivered by the Large Hadron Collider presents a significant challenge to the computing resources. During the LHC Run 2 (2015–2018), there were up to 70 inelastic interactions per bunch crossing, which need to be accounted for in Monte Carlo (MC) production. In this document, a new method to account for these additional interactions in the simulation chain is described. Instead of sampling the inelastic interactions and adding their energy deposits to a hard-scatter interaction one-by-one, the inelastic interactions are presampled, independent of the hard scatter, and stored as combined events. Consequently, for each hard-scatter interaction, only one such presampled event needs to be added as part of the simulation chain. For the Run 2 simulation chain, with an average of 35 interactions per bunch crossing, this new method provides a substantial reduction in MC production CPU needs of around 20%, while reproducing the properties of the reconstructed quantities relevant for physics analyses with good accuracy.
    Free, publicly-accessible full text available December 1, 2023
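The presampling idea in the abstract above can be sketched schematically: instead of simulating a fresh set of Poisson-distributed inelastic interactions for every hard-scatter event, a pool of combined pileup events is produced once and one pool entry is attached per hard scatter. This is a toy sketch only; the function names and the Gaussian "energy deposit" stand-in are invented for illustration and bear no relation to the actual ATLAS simulation chain.

```python
import math
import random

MU = 35  # average inelastic interactions per bunch crossing (Run 2 conditions)

def poisson(lam):
    # Knuth's algorithm; adequate for lam around 35.
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        p *= random.random()
        k += 1
    return k - 1

def simulate_inelastic():
    # Stand-in for an expensive minimum-bias detector simulation:
    # returns a single toy "energy deposit".
    return random.gauss(1.0, 0.1)

def presample_pileup(n_events):
    """Presample n combined pileup events, independent of any hard scatter."""
    return [sum(simulate_inelastic() for _ in range(poisson(MU)))
            for _ in range(n_events)]

def overlay(hard_scatter_deposits, presampled_pool):
    """Attach one presampled pileup event to each hard-scatter interaction."""
    return [hs + random.choice(presampled_pool) for hs in hard_scatter_deposits]

random.seed(0)
pool = presample_pileup(100)          # done once, reused across hard scatters
signal = overlay([2.0, 3.0, 2.5], pool)
```

The CPU saving comes from amortization: the pool is simulated once, whereas the one-by-one approach repeats roughly MU minimum-bias simulations for every hard-scatter event.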
  5. Abstract The ATLAS experiment at the Large Hadron Collider has a broad physics programme ranging from precision measurements to direct searches for new particles and new interactions, requiring ever larger and ever more accurate datasets of simulated Monte Carlo events. Detector simulation with Geant4 is accurate but requires significant CPU resources. Over the past decade, ATLAS has developed and utilized tools that replace the most CPU-intensive component of the simulation—the calorimeter shower simulation—with faster simulation methods. Here, AtlFast3, the next generation of high-accuracy fast simulation in ATLAS, is introduced. AtlFast3 combines parameterized approaches with machine-learning techniques and is deployed to meet the current and future computing challenges and simulation needs of the ATLAS experiment. With highly accurate performance and significantly improved modelling of substructure within jets, AtlFast3 can simulate large numbers of events for a wide range of physics processes.
    Free, publicly-accessible full text available December 1, 2023