
Search for: All records

Creators/Authors contains: "Liu, L"

  1. Molecular clocks are the basis for dating the divergence between lineages over macroevolutionary timescales (~10^5 to 10^8 years). However, classical DNA-based clocks tick too slowly to inform us about the recent past. Here, we demonstrate that stochastic DNA methylation changes at a subset of cytosines in plant genomes display a clocklike behavior. This “epimutation clock” is orders of magnitude faster than DNA-based clocks and enables phylogenetic explorations on a scale of years to centuries. We show experimentally that epimutation clocks recapitulate known topologies and branching times of intraspecies phylogenetic trees in the self-fertilizing plant Arabidopsis thaliana and the clonal seagrass Zostera marina, which represent two major modes of plant reproduction. This discovery will open new possibilities for high-resolution temporal studies of plant biodiversity.

    Free, publicly-accessible full text available September 29, 2024
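The clock logic described in the abstract can be sketched numerically. This is a minimal illustration, not the paper's estimator: the per-site epimutation rate, the linear-clock approximation (no back-mutation or saturation correction), and all numbers below are assumptions for the example.

```python
# Illustrative sketch: dating a divergence with an "epimutation clock".
# The rate and the linear-clock formula t = d / (2 * mu) are assumed
# for illustration only.

def estimate_divergence_years(meth_a, meth_b, rate_per_site_per_year):
    """Estimate time since divergence from two methylation profiles.

    meth_a, meth_b: sequences of 0/1 methylation states at the same cytosines.
    rate_per_site_per_year: assumed per-site epimutation rate.
    """
    if len(meth_a) != len(meth_b) or not meth_a:
        raise ValueError("profiles must be non-empty and of equal length")
    diffs = sum(a != b for a, b in zip(meth_a, meth_b))
    divergence = diffs / len(meth_a)
    # Both lineages accumulate epimutations independently after the split,
    # hence the factor of two in the denominator.
    return divergence / (2 * rate_per_site_per_year)

# Hypothetical example: 40 differing sites out of 10,000 surveyed cytosines,
# assuming a rate of 1e-4 epimutations per site per year.
t = estimate_divergence_years([0] * 9960 + [1] * 40, [0] * 10000, 1e-4)
```

Under these assumed numbers the estimate is on the order of decades, which is the years-to-centuries resolution the abstract describes.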
  2. There is great interest in “end-to-end” analysis that captures how innovation at the materials, device, and/or architectural levels will impact figures of merit at the application level. However, there are numerous combinations of devices and architectures to study, and we must establish systematic ways to accurately explore and cull a vast design space. We aim to capture how innovations at the materials/device level may ultimately impact figures of merit associated with both existing and emerging technologies that may be employed for logic and/or memory. We highlight how collaborations with researchers at these levels of the design hierarchy, as well as efforts to help construct well-calibrated device models, can in turn support architectural design-space explorations that identify the most promising ways to use new technologies to support application-level workloads of interest. For given compute workloads, we can then quantitatively assess the potential benefits of technology-driven architectures to identify the most promising paths forward. Because of the large number of potentially interesting device-architecture combinations, it is of the utmost importance to develop well-calibrated analytical modeling tools that can rapidly assess the potential value of a given (likely heterogeneous) solution. We highlight recent efforts and needs in this space.
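The kind of analytical sweep the abstract advocates can be sketched as follows. Every device, architecture, number, and metric here is a hypothetical placeholder; the point is only the shape of the exploration: combine per-device parameters with per-architecture parameters, score each pair with a closed-form figure of merit, and rank.

```python
# Hedged sketch of an analytical device-architecture design-space sweep.
# Devices contribute per-operation energy and delay; architectures
# contribute workload size and exploitable parallelism. Pairs are
# ranked by energy-delay product (EDP). All names/values are made up.

from itertools import product

devices = {                      # per-op switching energy (J), delay (s)
    "CMOS-7nm": {"energy": 1e-15, "delay": 5e-12},
    "NC-FET":   {"energy": 4e-16, "delay": 8e-12},
}
architectures = {                # ops per task, ops executable in parallel
    "systolic":    {"ops": 1e9, "parallelism": 1e4},
    "von-neumann": {"ops": 1e9, "parallelism": 1e2},
}

def edp(dev, arch):
    """Energy-delay product for one device-architecture pair."""
    energy = dev["energy"] * arch["ops"]
    latency = dev["delay"] * arch["ops"] / arch["parallelism"]
    return energy * latency

ranked = sorted(
    (edp(d, a), dev_name, arch_name)
    for (dev_name, d), (arch_name, a)
    in product(devices.items(), architectures.items())
)
best = ranked[0]   # lowest-EDP combination
```

A closed-form model like `edp` evaluates in microseconds per combination, which is what makes culling a vast space feasible before committing to detailed simulation.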
  3. Answer set programming (ASP) has long been used for modeling and solving hard search problems. Experience shows that the performance of ASP tools on different ASP encodings of the same problem may vary greatly from instance to instance, and it is rarely the case that one encoding outperforms all others. We describe a system and its implementation that, given a set of encodings and a training set of instances, builds performance models for the encodings, predicts the execution time of these encodings on new instances, and uses these predictions to select an encoding for solving.
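The select-by-predicted-runtime loop can be sketched in a few lines. A 1-nearest-neighbour lookup over instance features stands in for whatever regression model the described system actually trains; the features, encoding names, and timings are invented for the example.

```python
# Minimal sketch of per-encoding runtime prediction and encoding selection.
# Assumption: the performance model is a 1-nearest-neighbour regressor,
# used here only as a stand-in for the system's trained models.

def nearest_runtime(train, features):
    """Predict runtime as the measured time of the closest training instance."""
    def sq_dist(f):
        return sum((a - b) ** 2 for a, b in zip(f, features))
    return min(train, key=lambda pair: sq_dist(pair[0]))[1]

def select_encoding(models, features):
    """Pick the encoding whose model predicts the lowest runtime."""
    return min(models, key=lambda enc: nearest_runtime(models[enc], features))

# Training data per encoding: (instance features, measured solve time in s).
models = {
    "encoding_A": [((10, 0.2), 1.5), ((50, 0.8), 30.0)],
    "encoding_B": [((10, 0.2), 4.0), ((50, 0.8), 6.0)],
}
# A new instance resembling the large training instance: encoding_A is
# predicted slow there, so encoding_B should be chosen.
choice = select_encoding(models, (48, 0.7))
```

The design point is that selection happens per instance: no encoding wins everywhere, so the argmin is taken over predicted runtimes for the instance at hand.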
  5. The goal of compressed sensing is to estimate a high-dimensional vector from an underdetermined system of noisy linear equations. In analogy to classical compressed sensing, here we assume a generative model as a prior; that is, we assume the vector is represented by a deep generative model G : R^k → R^n. Classical recovery approaches such as empirical risk minimization (ERM) are guaranteed to succeed when the measurement matrix is sub-Gaussian. However, when the measurement matrix and measurements are heavy-tailed or have outliers, recovery may fail dramatically. In this paper we propose an algorithm inspired by the Median-of-Means (MOM). Our algorithm guarantees recovery for heavy-tailed data, even in the presence of outliers. Theoretically, our results show that our novel MOM-based algorithm enjoys the same sample complexity guarantees as ERM under sub-Gaussian assumptions. Our experiments validate both aspects of our claims: other algorithms are indeed fragile and fail under heavy-tailed and/or corrupted data, while our approach exhibits the predicted robustness.
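The Median-of-Means idea the abstract builds on can be shown in isolation. This sketch only demonstrates the core estimator, not the paper's full recovery algorithm: instead of averaging all per-measurement losses (which a single outlier can dominate), split them into batches, average within each batch, and take the median of the batch means.

```python
# Hedged sketch of the Median-of-Means (MOM) estimator underlying the
# robust recovery approach. The loss values below are invented to show
# why MOM resists outliers while the plain mean does not.

import statistics

def median_of_means(values, num_batches):
    """Median of batch means; robust to a minority of corrupted values."""
    size = len(values) // num_batches
    means = [
        sum(values[i * size:(i + 1) * size]) / size
        for i in range(num_batches)
    ]
    return statistics.median(means)

# Nine well-behaved losses plus one gross outlier: the plain mean is
# dragged far from 1, while MOM stays at 1 because the corrupted value
# contaminates only one of the five batches.
losses = [1.0] * 9 + [1000.0]
plain_mean = sum(losses) / len(losses)
robust = median_of_means(losses, 5)
```

As long as outliers corrupt fewer than half of the batches, the median of the batch means ignores them, which is the mechanism behind the heavy-tailed guarantees claimed in the abstract.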