


Search for: All records where Creators/Authors contains "Xu, H."


  1. Generating 3D graphs with symmetry-group equivariance holds intriguing potential for broad applications, from machine vision to molecular discovery. Emerging approaches adapt diffusion generative models (DGMs) with suitable re-engineering to capture 3D graph distributions. In this paper, we raise an orthogonal and fundamental question: in what (latent) space should we diffuse 3D graphs? ❶ We motivate the study with theoretical analysis showing that the performance bound of 3D graph diffusion can be improved in a latent space versus the original space, provided that the latent space has (i) low dimensionality yet (ii) high quality (i.e., low reconstruction error), and that the DGM has (iii) symmetry preservation as an inductive bias. ❷ Guided by these theoretical guidelines, we propose to perform 3D graph diffusion in a low-dimensional latent space, learned through cascaded 2D–3D graph autoencoders for low-error reconstruction and symmetry-group invariance. We dub the overall pipeline latent 3D graph diffusion. ❸ Motivated by applications in molecular discovery, we further extend latent 3D graph diffusion to conditional generation given SE(3)-invariant attributes or equivariant 3D objects. ❹ We also demonstrate empirically that out-of-distribution conditional generation can be further improved by regularizing the latent space via graph self-supervised learning. Comprehensive experiments validate that our method generates 3D molecules of higher validity / drug-likeness and comparable or better conformations / energetics, while training an order of magnitude faster. Code is released at https://github.com/Shen-Lab/LDM-3DG.
    Free, publicly-accessible full text available January 16, 2025
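The core pipeline in item 1 can be illustrated with a minimal sketch: encode a 3D graph into a low-dimensional latent, run the closed-form DDPM forward-diffusion step there, and decode back. This is a toy illustration under assumed simplifications, not the paper's method: the encoder/decoder here are plain linear maps (the paper uses cascaded 2D–3D graph autoencoders with symmetry-group invariance), and no denoising network is trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    # Toy linear encoder: project node features to a low-dim latent.
    return x @ W

def decode(z, W):
    # Toy decoder via the pseudo-inverse of the encoder map.
    return z @ np.linalg.pinv(W)

def forward_diffuse(z0, alpha_bar_t, rng):
    """Closed-form DDPM forward step:
    z_t = sqrt(alpha_bar_t) * z0 + sqrt(1 - alpha_bar_t) * eps."""
    eps = rng.standard_normal(z0.shape)
    return np.sqrt(alpha_bar_t) * z0 + np.sqrt(1.0 - alpha_bar_t) * eps

# Stand-in "3D graph": 8 nodes with 3-D coordinates as features.
x = rng.standard_normal((8, 3))
W = rng.standard_normal((3, 2))  # 3 -> 2 dims: diffuse in a *lower*-dim space

z0 = encode(x, W)                       # latent representation, shape (8, 2)
zt = forward_diffuse(z0, 0.5, rng)      # noised latent at some timestep t
x_rec = decode(z0, W)                   # reconstruction, shape (8, 3)
```

The paper's point (i)/(ii) shows up even in this sketch: the latent is lower-dimensional, so diffusion operates on fewer variables, but the decoder must keep reconstruction error low for generated latents to map back to valid graphs.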
  2. Why the Challenger Deep, the deepest point on Earth’s solid surface, is so deep is unclear, but part of the reason must be the age and density of the downgoing plate. Northwest Pacific oceanic crust subducting in the Izu-Bonin-Mariana Trench is Cretaceous and Jurassic, but the age and nature of Pacific oceanic crust subducting in the southernmost Mariana Trench remain unknown. Here we present the first study of seafloor basalts recovered by the full-ocean-depth crewed submersible Fendouzhe from the deepest seafloor around the Challenger Deep, from both the overriding and downgoing plates. ⁴⁰Ar/³⁹Ar ages indicate that downgoing basalts are Early Cretaceous (ca. 125 Ma), indicating they are part of the Pacific plate rather than the nearby Oligocene Caroline microplate. Downgoing-plate basalts are slightly enriched in incompatible elements but have trace element and Hf isotope compositions similar to those of other northwest Pacific mid-ocean ridge basalts (MORBs). They also have slightly enriched Sr-Nd-Pb isotope compositions like those of the Indian mantle domain. These features may have formed with contributions from plume-derived components via plume-ridge interactions. One sample from the overriding plate gives a ⁴⁰Ar/³⁹Ar age of ca. 55 Ma, about the same age as the subduction initiation that formed the Izu-Bonin-Mariana convergent margin. Our results suggest that 50%–90% of the Pb budget of Mariana arc magmas is derived from subducted MORBs with Indian-type isotope affinity.
  3. Over the past decade, a series of airborne experiments in the Arctic and Antarctica explored microwave emission from sea ice and ice sheets at frequencies from 0.5 to 2 GHz. The experiments were motivated by the fact that lower frequencies penetrate deeper into a frozen surface, thus offering the possibility to measure physical temperatures at great depths in ice sheets and, subsequently, other unique geophysical observables including sea ice salinity. These experiments were made feasible by recent engineering advances in electronics, antenna design, and noise-removal algorithms for operating outside of protected bands in the electromagnetic spectrum. These technical advances permit a new type of radiometer that not only operates at low frequency but also obtains continuous spectral information over the band from 0.5 to 2 GHz. Spectral measurements facilitate an understanding of the physical processes controlling emission and also support the interpretation of results from single-frequency instruments. This paper reviews the development of low-frequency, wideband radiometry and its application to cryosphere science over the past 10 years. The paper summarizes the engineering design of an airborne instrument and the associated algorithms to mitigate radio frequency interference. Theoretical models of emission built around the morphologic and electrical properties of cryospheric components are also described that identify the dominant physical processes contributing to emission spectra. New inversion techniques for geophysical parameter retrieval are summarized for both Arctic and Antarctic scenarios. Examples illustrate how the measurements inform glaciological problems. The paper concludes with a description of new instrument concepts that are foreseen to extend the technology into operation from space.
  4. A* is a classic and popular method for graph search and path finding. It assumes the existence of a heuristic function h(u,t) that estimates the shortest distance from any input node u to the destination t. Traditionally, heuristics have been handcrafted by domain experts. Over the last few years, however, there has been growing interest in learning heuristic functions. Such learned heuristics estimate the distance between given nodes based on "features" of those nodes. In this paper we formalize and initiate the study of such feature-based heuristics. In particular, we consider heuristics induced by norm embeddings and distance labeling schemes, and provide lower bounds on the tradeoffs between the number of dimensions or bits used to represent each graph node and the running time of the A* algorithm. We also show that, under natural assumptions, our lower bounds are almost optimal.
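The feature-based setting in item 4 is easy to sketch: each node carries an embedding, and the heuristic h(u,t) is the norm of the embedding difference (a norm embedding, one of the two families the paper analyzes). The following is a minimal, assumed illustration with a hypothetical three-node graph and hand-picked 2-D embeddings; the paper proves lower bounds for such heuristics rather than giving an implementation.

```python
import heapq

def astar(graph, source, target, h):
    """A* over an adjacency-list graph {u: [(v, weight), ...]}.
    If h is admissible (never overestimates dist(u, target)),
    the returned value is the true shortest distance."""
    dist = {source: 0.0}
    pq = [(h(source, target), source)]  # priority = g(u) + h(u, t)
    while pq:
        f, u = heapq.heappop(pq)
        if u == target:
            return dist[u]
        if f > dist[u] + h(u, target):  # stale queue entry, skip
            continue
        for v, w in graph[u]:
            nd = dist[u] + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd + h(v, target), v))
    return float("inf")

# Hypothetical feature-based heuristic: a 2-D norm embedding per node,
# with h(u, t) = Euclidean distance between the embeddings.
emb = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (2.0, 0.0)}
def h(u, t):
    return ((emb[u][0] - emb[t][0]) ** 2 + (emb[u][1] - emb[t][1]) ** 2) ** 0.5

graph = {"a": [("b", 1.0)], "b": [("c", 1.0)], "c": []}
print(astar(graph, "a", "c", h))  # → 2.0
```

The paper's tradeoff is visible here: the embedding dimension (2 in this toy) controls how much node-distance information the heuristic can encode, which in turn bounds how effectively A* can prune its search.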
  5. X-ray bursts are among the brightest stellar objects frequently observed in the sky by space-based telescopes. A type-I X-ray burst is understood as a violent thermonuclear explosion on the surface of a neutron star accreting matter from a companion star in a binary system. The bursts are powered by a nuclear reaction sequence known as the rapid proton capture process (rp process), which involves hundreds of exotic neutron-deficient nuclides. At so-called waiting-point nuclides, the process stalls until a slower β⁺ decay enables a bypass. One of the handful of rp-process waiting-point nuclides is ⁶⁴Ge, which plays a decisive role in matter flow and therefore the produced X-ray flux. Here we report precision measurements of the masses of ⁶³Ge, ⁶⁴,⁶⁵As, and ⁶⁶,⁶⁷Se—the relevant nuclear masses around the waiting point ⁶⁴Ge—and use them as inputs for X-ray burst model calculations. We obtain the X-ray burst light curve to constrain the neutron-star compactness, and suggest that the distance to the X-ray burster GS 1826–24 needs to be increased by about 6.5% to match astronomical observations. The nucleosynthesis results affect the thermal structure of accreting neutron stars, which will subsequently modify the calculations of associated observables.