Search for: All records

Creators/Authors contains: "Manning, C"


  1. Ocean sediments consist mainly of calcium carbonate and organic matter (phytoplankton debris). Once subducted, some carbon is removed from the slab and returns to the atmosphere as CO2 in arc magmas. Its isotopic signature is thought to reflect the bulk fraction of inorganic (carbonate) and organic (graphitic) carbon in the sedimentary source. Here we challenge this assumption by experimentally investigating model sediments composed of 13C-CaCO3 + 12C-graphite interacting with water at the pressure, temperature and redox conditions of an average slab–mantle interface beneath arcs. We show that oxidative dissolution of graphite is the main process controlling the production of CO2, and that its isotopic composition reflects the CO2/CaCO3 rather than the bulk graphite/CaCO3 (i.e., organic/inorganic carbon) fraction. We provide a mathematical model relating the arc CO2 isotopic signature to the fluid–rock ratios and the redox state prevailing in its subarc source. (A generic isotope-mixing relation illustrating this point is sketched below.)
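     To make the mass-balance reasoning concrete, a generic two-endmember carbon isotope mixing relation can be written as below. This is an illustrative sketch, not the authors' published model; f and the endmember δ13C values are assumed symbols for the fraction of CO2 sourced from carbonate and the compositions of the carbonate and graphite reservoirs:

         \delta^{13}\mathrm{C}_{\mathrm{CO_2}} \;\approx\; f\,\delta^{13}\mathrm{C}_{\mathrm{carbonate}} + (1-f)\,\delta^{13}\mathrm{C}_{\mathrm{graphite}},
         \qquad f = \frac{\mathrm{CO_2\ derived\ from\ carbonate}}{\mathrm{total\ CO_2}}

     Read this way, the measured δ13C of arc CO2 tracks how much graphite has been oxidatively dissolved into the fluid, which depends on the fluid–rock ratio and redox state, rather than the bulk organic/inorganic carbon budget of the subducted sediment.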
  2. Answering complex questions about textual narratives requires reasoning over both the stated context and the world knowledge that underlies it. However, pretrained language models (LMs), the foundation of most modern QA systems, do not robustly represent the latent relationships between concepts that such reasoning requires. While knowledge graphs (KGs) are often used to augment LMs with structured representations of world knowledge, it remains an open question how to effectively fuse and reason over the KG representations and the language context, which provides situational constraints and nuances. In this work, we propose GreaseLM, a new model that fuses encoded representations from pretrained LMs and graph neural networks over multiple layers of modality-interaction operations. Information from each modality propagates to the other, allowing language-context representations to be grounded by structured world knowledge, and allowing linguistic nuances (e.g., negation, hedging) in the context to inform the graph representations of knowledge. Our results on three benchmarks in the commonsense reasoning (CommonsenseQA, OpenBookQA) and medical question answering (MedQA-USMLE) domains demonstrate that GreaseLM can more reliably answer questions that require reasoning over both situational constraints and structured knowledge, even outperforming models 8x larger. (A schematic fusion layer is sketched below.)
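A minimal PyTorch-style sketch of the kind of layerwise fusion the abstract describes: text token states and KG node states are each updated by their own encoder, then exchange information through a dedicated interaction token/node pair. All names here (FusionLayer, SimpleGNNLayer, the tensor shapes) are illustrative assumptions, not GreaseLM's actual implementation.

import torch
import torch.nn as nn


class SimpleGNNLayer(nn.Module):
    """One round of mean-aggregation message passing over a dense adjacency matrix."""

    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(dim, dim)
        self.upd = nn.Linear(2 * dim, dim)

    def forward(self, node_feats, adj):
        # node_feats: (num_nodes, dim); adj: (num_nodes, num_nodes) 0/1 matrix
        messages = self.msg(node_feats)
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        agg = adj @ messages / deg                   # mean over neighbours
        return torch.relu(self.upd(torch.cat([node_feats, agg], dim=-1)))


class FusionLayer(nn.Module):
    """One modality-interaction layer: a transformer layer updates the text tokens,
    a GNN layer updates the KG nodes, then a special token/node pair is jointly
    mixed so information flows in both directions."""

    def __init__(self, dim, n_heads=4):
        super().__init__()
        self.text_layer = nn.TransformerEncoderLayer(dim, n_heads, batch_first=True)
        self.graph_layer = SimpleGNNLayer(dim)
        self.mix = nn.Linear(2 * dim, 2 * dim)       # joint update of the token/node pair

    def forward(self, text_states, node_states, adj):
        # text_states: (1, seq_len, dim), position 0 is the interaction token
        # node_states: (num_nodes, dim), row 0 is the interaction node
        text_states = self.text_layer(text_states)
        node_states = self.graph_layer(node_states, adj)
        dim = text_states.size(-1)
        joint = torch.cat([text_states[0, 0], node_states[0]], dim=-1)
        mixed = torch.relu(self.mix(joint))
        text_states = torch.cat([mixed[:dim].view(1, 1, dim), text_states[:, 1:]], dim=1)
        node_states = torch.cat([mixed[dim:].view(1, dim), node_states[1:]], dim=0)
        return text_states, node_states


# Toy usage: stack a few fusion layers so the two modalities inform each other repeatedly.
dim, seq_len, num_nodes = 64, 16, 8
text = torch.randn(1, seq_len, dim)                  # stand-in for LM hidden states
nodes = torch.randn(num_nodes, dim)                  # stand-in for KG node embeddings
adj = (torch.rand(num_nodes, num_nodes) > 0.5).float()
for layer in [FusionLayer(dim) for _ in range(3)]:
    text, nodes = layer(text, nodes, adj)
print(text.shape, nodes.shape)                       # torch.Size([1, 16, 64]) torch.Size([8, 64])

Stacking several such layers is what allows linguistic nuances (e.g., negation) to reshape the graph representation and KG structure to ground the text representation, which is the behaviour the abstract attributes to GreaseLM.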