
Search for: All records

Creators/Authors contains: "Zhou, B."

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Social media platforms are playing increasingly critical roles in disaster response and rescue operations. During emergencies, users can post rescue requests along with their addresses on social media, while volunteers can search for those messages and send help. However, efficiently leveraging social media in rescue operations remains challenging because of the lack of tools to identify rescue request messages on social media automatically and rapidly. Analyzing social media data, such as Twitter data, relies heavily on Natural Language Processing (NLP) algorithms to extract information from texts. Bidirectional transformer models, such as the Bidirectional Encoder Representations from Transformers (BERT) model, have significantly outperformed previous NLP models in numerous text analysis tasks, providing new opportunities to precisely understand and classify social media data for diverse applications. This study developed and compared ten VictimFinder models for identifying rescue request tweets: three based on milestone NLP algorithms and seven based on BERT. A total of 3191 manually labeled disaster-related tweets posted during Hurricane Harvey in 2017 were used as the training and testing datasets. We evaluated the performance of each model by classification accuracy, computation cost, and model stability. Experiment results show that all BERT-based models significantly increased the accuracy of categorizing rescue-related tweets. The best model for identifying rescue request tweets is a customized BERT-based model with a Convolutional Neural Network (CNN) classifier. Its F1-score is 0.919, outperforming the baseline model by 10.6%. The developed models can promote the use of social media for rescue operations in future disaster events.
    Free, publicly-accessible full text available July 1, 2023
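The model comparison in record 1 is reported in terms of the F1-score, which balances precision and recall for the rescue-request class. As a minimal illustration of how that metric is computed (the labels below are made up for the example; they are not data from the study):

```python
def f1_score(y_true, y_pred, positive=1):
    """Binary F1 for the positive (here: rescue-request) class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative labels: 1 = rescue request, 0 = other disaster-related tweet
y_true = [1, 1, 1, 0, 0, 1, 0, 1]
y_pred = [1, 1, 0, 0, 1, 1, 0, 1]
print(round(f1_score(y_true, y_pred), 3))  # prints 0.8
```

In the study itself, the BERT+CNN model reaches F1 = 0.919 on the labeled Hurricane Harvey tweets, which is where the reported 10.6% improvement over the baseline comes from.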
  2. Abstract The accurate simulation of additional interactions at the ATLAS experiment for the analysis of proton–proton collisions delivered by the Large Hadron Collider presents a significant challenge to the computing resources. During the LHC Run 2 (2015–2018), there were up to 70 inelastic interactions per bunch crossing, which need to be accounted for in Monte Carlo (MC) production. In this document, a new method to account for these additional interactions in the simulation chain is described. Instead of sampling the inelastic interactions and adding their energy deposits to a hard-scatter interaction one-by-one, the inelastic interactions are presampled, independent of the hard scatter, and stored as combined events. Consequently, for each hard-scatter interaction, only one such presampled event needs to be added as part of the simulation chain. For the Run 2 simulation chain, with an average of 35 interactions per bunch crossing, this new method provides a substantial reduction in MC production CPU needs of around 20%, while reproducing the properties of the reconstructed quantities relevant for physics analyses with good accuracy.
    Free, publicly-accessible full text available December 1, 2023
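The saving described in record 2 comes from a simple bookkeeping change: the standard chain samples roughly Poisson(μ) inelastic minimum-bias interactions and overlays them on each hard scatter one-by-one, while the presampled chain overlays exactly one precombined pileup event per hard scatter. A toy sketch of that counting argument, assuming μ = 35 as in the text (event counts here are illustrative; the ~20% CPU figure reflects the full production chain, not this toy model):

```python
import math
import random

MU = 35  # average inelastic interactions per bunch crossing (Run 2 average)

def poisson(lam, rng):
    """Knuth's exponential-product Poisson sampler."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def overlays_standard(n_hard_scatter, rng):
    """Standard chain: ~Poisson(MU) minimum-bias overlays per hard scatter."""
    return sum(poisson(MU, rng) for _ in range(n_hard_scatter))

def overlays_presampled(n_hard_scatter):
    """Presampled chain: one precombined pileup event per hard scatter."""
    return n_hard_scatter

rng = random.Random(0)
n = 1000
print(overlays_standard(n, rng) / n)   # roughly 35 overlay samples per event
print(overlays_presampled(n) / n)      # exactly 1.0
```

The per-event overlay count drops from about 35 to 1; the residual cost in the real chain (hard-scatter simulation, digitization, reconstruction) is what limits the overall CPU reduction to around 20%.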
  3. Abstract The ATLAS experiment at the Large Hadron Collider has a broad physics programme ranging from precision measurements to direct searches for new particles and new interactions, requiring ever larger and ever more accurate datasets of simulated Monte Carlo events. Detector simulation with Geant4 is accurate but requires significant CPU resources. Over the past decade, ATLAS has developed and utilized tools that replace the most CPU-intensive component of the simulation—the calorimeter shower simulation—with faster simulation methods. Here, AtlFast3, the next generation of high-accuracy fast simulation in ATLAS, is introduced. AtlFast3 combines parameterized approaches with machine-learning techniques and is deployed to meet current and future computing challenges, and simulation needs of the ATLAS experiment. With highly accurate performance and significantly improved modelling of substructure within jets, AtlFast3 can simulate large numbers of events for a wide range of physics processes.
    Free, publicly-accessible full text available December 1, 2023
  4. Free, publicly-accessible full text available May 1, 2023
  5. Free, publicly-accessible full text available May 1, 2023
  6. Abstract The energy response of the ATLAS calorimeter is measured for single charged pions with transverse momentum in the range 10 < pT < […] response in the hadronic calorimeter are also compared between data and simulation.
    Free, publicly-accessible full text available March 1, 2023
  7. Abstract Searches are conducted for new spin-0 or spin-1 bosons using events where a Higgs boson with mass 125 GeV decays into four leptons (ℓ = e, μ). This decay is presumed to occur via an intermediate state which contains two on-shell, promptly decaying bosons: H → XX/ZX → 4ℓ, where the new boson X has a mass between 1 and 60 GeV. The search uses pp collision data collected with the ATLAS detector at the LHC with an integrated luminosity of 139 fb⁻¹ at a centre-of-mass energy √s = 13 TeV. The data are found to be consistent with Standard Model expectations. Limits are set on fiducial cross sections and on the branching ratio of the Higgs boson to decay into XX/ZX, improving those from previous publications by a factor between two and four. Limits are also set on mixing parameters relevant in extensions of the Standard Model containing a dark sector where X is interpreted to be a dark boson.
    Free, publicly-accessible full text available March 1, 2023