

Search for: All records

Creators/Authors contains: "Haase, Andrew"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not be freely available during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Recent approaches have shown promise in distilling diffusion models into efficient one-step generators. Among them, Distribution Matching Distillation (DMD) produces one-step generators that match their teacher in distribution, without enforcing a one-to-one correspondence with the sampling trajectories of their teachers. However, to ensure stable training, DMD requires an additional regression loss computed using a large set of noise-image pairs generated by the teacher with many steps of a deterministic sampler. This is costly for large-scale text-to-image synthesis, and it limits the student's quality by tying it too closely to the teacher's original sampling paths. We introduce DMD2, a set of techniques that lift this limitation and improve DMD training. First, we eliminate the regression loss and the need for expensive dataset construction. We show that the resulting instability is due to the fake critic not estimating the distribution of generated samples accurately, and we propose a two time-scale update rule as a remedy. Second, we integrate a GAN loss into the distillation procedure, discriminating between generated samples and real images. This lets us train the student model on real data, mitigating the imperfect real score estimation from the teacher model and enhancing quality. Lastly, we modify the training procedure to enable multi-step sampling. We identify and address the training-inference input mismatch problem in this setting by simulating inference-time generator samples during training. Taken together, our improvements set new benchmarks in one-step image generation, with FID scores of 1.28 on ImageNet-64x64 and 8.35 on zero-shot COCO 2014, surpassing the original teacher despite a 500X reduction in inference cost. Further, we show our approach can generate megapixel images by distilling SDXL, demonstrating exceptional visual quality among few-step methods.
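The two time-scale update rule mentioned in the abstract above can be sketched as a training loop in which the fake critic takes several gradient steps for every generator step, so that its estimate of the current generator distribution stays fresh. The skeleton below is an illustrative assumption, not the paper's code: the step ratio and the placeholder update counters stand in for real gradient updates.

```python
# Minimal sketch of a two time-scale update rule: refresh the fake critic
# several times per generator step so its score estimate of the generated
# distribution stays accurate. The ratio of 5 is an assumed illustrative
# value; the counters below stand in for actual gradient steps.

CRITIC_STEPS_PER_GEN_STEP = 5  # assumed ratio, tuned in practice

def train(num_gen_steps):
    critic_updates = 0
    gen_updates = 0
    for _ in range(num_gen_steps):
        # 1) update the fake critic multiple times on fresh generator samples
        for _ in range(CRITIC_STEPS_PER_GEN_STEP):
            critic_updates += 1  # placeholder for one critic gradient step
        # 2) one generator step against the now up-to-date critic
        gen_updates += 1         # placeholder for one generator gradient step
    return critic_updates, gen_updates
```

In a real implementation each placeholder line would be a loss computation and optimizer step; the structural point is only that the critic runs on the faster time scale.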
  2. Recent rapid thinning of West Antarctic ice shelves is believed to be caused by intrusions of warm deep water that induce basal melting and seaward meltwater export. This study uses data from three bottom-mounted mooring arrays to show seasonal variability and local forcing for the currents moving into and out of the Dotson ice shelf cavity. A southward flow of warm, salty water had maximum current velocities along the eastern channel slope, while northward outflows of freshened ice shelf meltwater spread at intermediate depth above the western slope. The inflow correlated with the local ocean surface stress curl. At the western slope, meltwater outflows followed the warm influx along the eastern slope with a ~2–3 month delay. Ocean circulation near Dotson Ice Shelf, affected by sea ice distribution and wind, appears to significantly control the inflow of warm water and subsequent ice shelf melting on seasonal time-scales.
  3. Integrating renewable energy into the manufacturing facility is the ultimate key to realising carbon-neutral operations. Although many firms have taken various initiatives to reduce the carbon footprint of their facilities, there are few quantitative studies focused on cost analysis and supply reliability of integrating intermittent wind and solar power. This paper aims to fill this gap by addressing the following question: shall we adopt a power purchase agreement (PPA) or onsite renewable generation to realise the eco-economic benefits? We tackle this complex decision-making problem by considering two regulatory options: government carbon incentives and utility pricing policy. A stochastic programming model is formulated to search for the optimal mix of onsite and offsite renewable power supply. The model is tested extensively in different regions under various climatic conditions. Three findings are obtained. First, in the long term, onsite generation and PPA can avoid the price volatility of the spot or wholesale electricity market. Second, at locations where the wind speed is below 6 m/s, a PPA at $70/MWh is preferred over onsite wind generation. Third, compared to PPA and wind generation, solar generation is not economically competitive unless the capacity cost falls below USD 1.5M per MW.
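The trade-off evaluated by the stochastic program in the abstract above can be illustrated with a toy expected-cost comparison: a PPA pays a fixed price per MWh, while onsite generation's effective cost depends on uncertain wind scenarios, with shortfalls bought at a spot price. The $70/MWh PPA price comes from the abstract; every other number and function name here is an invented assumption for illustration, far simpler than the paper's model.

```python
# Toy expected-cost comparison: fixed-price PPA versus onsite wind under
# wind-speed uncertainty. Only the $70/MWh PPA price is from the abstract;
# scenarios, costs, and spot price are illustrative assumptions.

PPA_PRICE = 70.0  # $/MWh, quoted in the abstract

def expected_cost_ppa(demand_mwh):
    # A PPA locks in a deterministic cost regardless of weather.
    return PPA_PRICE * demand_mwh

def expected_cost_onsite(demand_mwh, scenarios, onsite_cost_per_mwh, spot_price):
    # scenarios: list of (probability, capacity_factor) pairs summing to 1.
    cost = 0.0
    for prob, capacity_factor in scenarios:
        onsite = demand_mwh * capacity_factor   # energy covered onsite
        shortfall = demand_mwh - onsite         # bought on the spot market
        cost += prob * (onsite * onsite_cost_per_mwh + shortfall * spot_price)
    return cost
```

A decision rule then simply picks the supply mix with the lower expected cost; the paper's stochastic program extends this idea with carbon incentives, utility pricing policy, and a richer scenario set.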
  4. Phenology is the study of recurring events in nature and their relationships with climate. The word derives from the Greek phaínō ‘appear’ and logos ‘reason’, emphasizing the focus on observing events and understanding why they occur (Demarée and Rutishauser 2009). Phenological recording has a history that dates back many centuries (Linnaeus and Bark 1753; Aono and Kazui 2008). More recently, advances in monitoring technologies have enabled automated and remotely sensed observations, complemented by increasing citizen science participation in monitoring efforts. Phenological information can also be derived from widespread environmental monitoring stations around the globe.
  5. This paper presents a search for massive, charged, long-lived particles with the ATLAS detector at the Large Hadron Collider using an integrated luminosity of $$140~\mathrm{fb}^{-1}$$ of proton-proton collisions at $$\sqrt{s}=13$$~TeV. These particles are expected to move significantly slower than the speed of light. In this paper, two signal regions provide complementary sensitivity. In one region, events are selected with at least one charged-particle track with high transverse momentum, large specific ionisation measured in the pixel detector, and time of flight to the hadronic calorimeter inconsistent with the speed of light. In the other region, events are selected with at least two tracks of opposite charge which both have a high transverse momentum and an anomalously large specific ionisation. The search is sensitive to particles with lifetimes greater than about 3 ns with masses ranging from 200 GeV to 3 TeV. The results are interpreted to set constraints on the supersymmetric pair production of long-lived R-hadrons, charginos and staus, with mass limits extending beyond those from previous searches over broad ranges of lifetime.
    Free, publicly-accessible full text available July 1, 2026
  6. This report presents a comprehensive collection of searches for new physics performed by the ATLAS Collaboration during the Run~2 period of data taking at the Large Hadron Collider, from 2015 to 2018, corresponding to about $$140~\mathrm{fb}^{-1}$$ of $$\sqrt{s}=13$$~TeV proton--proton collision data. These searches cover a variety of beyond-the-standard model topics such as dark matter candidates, new vector bosons, hidden-sector particles, leptoquarks, or vector-like quarks, among others. Searches for supersymmetric particles or extended Higgs sectors are explicitly excluded as these are the subject of separate reports by the Collaboration. For each topic, the most relevant searches are described, focusing on their importance and sensitivity and, when appropriate, highlighting the experimental techniques employed. In addition to the description of each analysis, complementary searches are compared, and the overall sensitivity of the ATLAS experiment to each type of new physics is discussed. Summary plots and statistical combinations of multiple searches are included whenever possible.
    Free, publicly-accessible full text available April 22, 2026
  7. Top-quark pair production is observed in lead–lead (Pb+Pb) collisions at $$\sqrt{s_{\mathrm{NN}}}=5.02$$~TeV at the Large Hadron Collider with the ATLAS detector. The data sample was recorded in 2015 and 2018, amounting to an integrated luminosity of $$1.9~\mathrm{nb}^{-1}$$. Events with exactly one electron and one muon and at least two jets are selected. Top-quark pair production is measured with an observed (expected) significance of 5.0 (4.1) standard deviations. The measured top-quark pair production cross section is $$\sigma_{t\bar{t}} = 3.6^{+1.0}_{-0.9}\,(\mathrm{stat})\,^{+0.8}_{-0.5}\,(\mathrm{syst})~\mu\mathrm{b}$$, with a total relative uncertainty of 31%, and is consistent with theoretical predictions using a range of different nuclear parton distribution functions. The observation of this process consolidates the evidence of the existence of all quark flavors in the preequilibrium stage of the quark-gluon plasma at very high energy densities, similar to the conditions present in the early Universe. © 2025 CERN, for the ATLAS Collaboration.
    Free, publicly-accessible full text available April 1, 2026
  8. A study of the Higgs boson decaying into bottom quarks (H→$$b\bar{b}$$) and charm quarks (H→$$c\bar{c}$$) is performed, in the associated production channel of the Higgs boson with a W or Z boson, using $$140~\mathrm{fb}^{-1}$$ of proton-proton collision data at $$\sqrt{s}=13$$~TeV collected by the ATLAS detector. The individual production of WH and ZH with H→$$b\bar{b}$$ is established with observed (expected) significances of 5.3 (5.5) and 4.9 (5.6) standard deviations, respectively. Differential cross-section measurements of the gauge boson transverse momentum within the simplified template cross-section framework are performed in a total of 13 kinematical fiducial regions. The search for the H→$$c\bar{c}$$ decay yields an observed (expected) upper limit at 95% confidence level of 11.5 (10.6) times the Standard Model prediction. The results are also used to set constraints on the charm coupling modifier, resulting in $$|\kappa_c| < 4.2$$ at 95% confidence level. Combining the H→$$b\bar{b}$$ and H→$$c\bar{c}$$ measurements constrains the absolute value of the ratio of Higgs-charm and Higgs-bottom coupling modifiers ($$|\kappa_c/\kappa_b|$$) to be less than 3.6 at 95% confidence level.
    Free, publicly-accessible full text available April 1, 2026
  9. The ATLAS experiment has developed extensive software and distributed computing systems for Run 3 of the LHC. These systems are described in detail, including software infrastructure and workflows, distributed data and workload management, database infrastructure, and validation. The use of these systems to prepare the data for physics analysis and to assess its quality is described, along with the software tools used for data analysis itself. An outlook for the development of these projects towards Run 4 is also provided.
    Free, publicly-accessible full text available March 6, 2026