


Search for: All records

Creators/Authors contains: "Wang, Xueying"


  1. Free, publicly-accessible full text available May 1, 2024
  2. Drug-resistant HIV-1 has become a growing concern in clinical practice and public health. Although combination antiretroviral therapy can substantially suppress viral loads in patients with HIV-1, it cannot eradicate the virus. Continued viral replication during sub-optimal therapy (due to poor adherence or other reasons) may lead to the accumulation of drug resistance mutations, resulting in an increased risk of disease progression. Many studies also suggest that events occurring during the early stage of HIV-1 infection (i.e., the first few hours to days following HIV exposure) may determine whether the infection is successfully established. However, the numbers of infected cells and viruses during this early stage are extremely low, and stochasticity may play a critical role in dictating the fate of infection. In this paper, we use stochastic models to investigate viral infection and the emergence of drug resistance in HIV-1. The stochastic model is formulated as a continuous-time Markov chain (CTMC), derived from an ordinary differential equation model proposed by Kitayimbwa et al. that includes both forward and backward mutations. An analytic estimate of the probability of clearance of HIV infection in the CTMC model near the infection-free equilibrium is obtained by a multitype branching process approximation, and the analytical predictions are validated by numerical simulations. Unlike the deterministic dynamics, where the basic reproduction number $ \mathcal{R}_0 $ serves as a sharp threshold parameter (i.e., the disease dies out if $ \mathcal{R}_0 < 1 $ and persists if $ \mathcal{R}_0 > 1 $), the stochastic models show that there is always a positive probability for HIV infection to be eradicated in patients. In the presence of antiretroviral therapy, our results show that the chance of clearing the infection tends to increase, although drug resistance is likely to emerge.

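The branching-process estimate described in this abstract can be illustrated with a toy model. The sketch below is not from the paper: it simulates a single-type linear birth-death CTMC (a crude stand-in for early infected-cell dynamics, via its embedded jump chain, which suffices for extinction probabilities) and compares the simulated clearance frequency against the classical branching-process prediction min(1, death/birth). All rate values are hypothetical.

```python
import random

def gillespie_extinction(birth=1.5, death=1.0, n0=1, cap=500, runs=2000, seed=0):
    """Estimate the probability that a linear birth-death process
    (rates birth*n and death*n) hits zero before reaching a large
    population cap, starting from n0 individuals.  Only the embedded
    jump chain is simulated: holding times do not affect which
    absorbing state is reached."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(runs):
        n = n0
        while 0 < n < cap:
            # next event is a birth with probability birth/(birth+death)
            if rng.random() < birth / (birth + death):
                n += 1
            else:
                n -= 1
        if n == 0:
            extinct += 1
    return extinct / runs

p = gillespie_extinction()  # branching-process theory predicts about (death/birth)**n0 = 2/3
```

Note that clearance has positive probability (here about 2/3) even though the process is supercritical, mirroring the abstract's point that stochastic models always allow eradication even when the deterministic threshold predicts persistence.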
  4. The task of temporal slot filling (TSF) is to extract the values of specific attributes of a given entity, called “facts”, together with temporal tags for those facts, from text data. While existing work represents temporal tags as single time slots, in this paper we introduce and study the task of Precise TSF (PTSF), which fills two precise temporal slots: the beginning and ending time points. Our observations on a news corpus show that most facts should have both points; however, fewer than 0.1% of them have explicit time expressions in the documents. On the other hand, a document's post time, though often available, is only a rough proxy for when a fact was valid. Therefore, directly decomposing the time expressions or using an arbitrary post-time period cannot provide accurate results for PTSF. The challenge of PTSF lies in finding precise time tags in noisy and incomplete temporal contexts in the text. To address this challenge, we propose an unsupervised approach based on the philosophy of truth finding. The approach has two modules that mutually enhance each other: one is a reliability estimator for fact extractors conditioned on the temporal contexts; the other is a fact trustworthiness estimator based on the extractors' reliability. Commonsense knowledge (e.g., a country has only one president at any given time) is automatically generated from data and used to infer false claims from trustworthy facts. For evaluation, we manually collect hundreds of temporal facts from Wikipedia as ground truth, including countries' presidential terms and sports teams' player career histories. Experiments on a large news dataset demonstrate the accuracy and efficiency of our proposed algorithm.
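The mutual-enhancement loop this abstract describes can be sketched as a simple fixed-point iteration in the general spirit of truth-finding methods. This is a hypothetical, heavily simplified illustration, not the authors' PTSF algorithm: the extractor names, the 1 − ∏(1 − r) support rule, and the initial reliability of 0.8 are all assumptions made for the sketch, and the temporal-context conditioning is omitted entirely.

```python
def truth_finding(claims, n_iter=10):
    """claims: dict mapping extractor name -> {fact_id: claimed value}.
    Alternates between two estimates that reinforce each other:
    a (fact, value) pair is trustworthy if reliable extractors assert it,
    and an extractor is reliable if the pairs it asserts are trustworthy."""
    reliability = {s: 0.8 for s in claims}  # assumed uniform prior
    trust = {}
    for _ in range(n_iter):
        # trustworthiness of each (fact, value): probability that at least
        # one supporting extractor is right, 1 - prod(1 - r_s)
        support = {}
        for s, facts in claims.items():
            for f, v in facts.items():
                support.setdefault((f, v), []).append(reliability[s])
        trust = {}
        for fv, rs in support.items():
            miss = 1.0
            for r in rs:
                miss *= 1.0 - r
            trust[fv] = 1.0 - miss
        # reliability of each extractor: mean trust of its claims
        for s, facts in claims.items():
            scores = [trust[(f, v)] for f, v in facts.items()]
            reliability[s] = sum(scores) / len(scores)
    # resolve each fact to its highest-trust value
    best = {}
    for (f, v), t in trust.items():
        if f not in best or t > best[f][1]:
            best[f] = (v, t)
    return {f: v for f, (v, _) in best.items()}, reliability

# toy conflict: extractor C disagrees with A and B on fact q2
claims = {"A": {"q1": "x", "q2": "y"},
          "B": {"q1": "x", "q2": "y"},
          "C": {"q1": "x", "q2": "z"}}
resolved, rel = truth_finding(claims)
```

After a few iterations the majority value for `q2` wins and extractor C's estimated reliability drops below that of A and B, which is the qualitative behavior the mutual-enhancement loop is designed to produce.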
  5. Reconstructed terabyte and petabyte electron microscopy image volumes contain fully-segmented neurons at resolutions fine enough to identify every synaptic connection. After manual or automatic reconstruction, neuroscientists want to extract wiring diagrams and connectivity information to analyze the data at a higher level. Despite significant advances in image acquisition, neuron segmentation, and synapse detection techniques, the extracted wiring diagrams are still quite coarse, and often do not take into account the wealth of information in the densely reconstructed volumes. We propose a synapse-aware skeleton generation strategy to transform the reconstructed volumes into an information-rich yet abstract format on which neuroscientists can perform biological analysis and run simulations. Our method extends existing topological thinning strategies and guarantees a one-to-one correspondence between skeleton endpoints and synapses while simultaneously generating vital geometric statistics on the neuronal processes. We demonstrate our results on three large-scale connectomic datasets and compare against current state-of-the-art skeletonization algorithms. 
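The key guarantee in this last abstract — every skeleton endpoint corresponds to a synapse — can be illustrated with a toy 2D thinning loop that never deletes "anchor" pixels standing in for synapse locations. This is not the authors' 3D algorithm: the grid, the anchor coordinates, and the crossing-number deletion test below are textbook simplifications chosen only to show how anchoring forces endpoints onto synapses.

```python
def thin_with_anchors(grid, anchors):
    """Sequential topological thinning of a 2D binary image (list of
    0/1 rows).  A foreground pixel is deleted only if it is not an
    anchor and is a simple border point (its 8-neighbor ring contains
    exactly one run of foreground, so deletion preserves connectivity).
    Non-anchor endpoints are deletable, so spurious branches are pruned
    and every surviving endpoint is an anchor."""
    anchors = set(anchors)
    H, W = len(grid), len(grid[0])
    # 8-neighborhood in clockwise ring order around the pixel
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    changed = True
    while changed:
        changed = False
        for r in range(H):
            for c in range(W):
                if not grid[r][c] or (r, c) in anchors:
                    continue
                ring = [grid[r + dr][c + dc]
                        if 0 <= r + dr < H and 0 <= c + dc < W else 0
                        for dr, dc in offs]
                n = sum(ring)
                # crossing number: 0 -> 1 transitions around the ring
                a = sum(ring[i] == 0 and ring[(i + 1) % 8] == 1
                        for i in range(8))
                if n >= 1 and a == 1:  # simple point: safe to delete
                    grid[r][c] = 0
                    changed = True
    return grid

# toy volume: a solid 5x10 slab with two "synapses" at opposite corners
grid = [[1] * 10 for _ in range(5)]
thin_with_anchors(grid, {(0, 0), (4, 9)})
```

After thinning, the slab collapses to a thin connected path joining the two anchors, and no non-anchor pixel is left with fewer than two neighbors — the 2D analogue of the one-to-one endpoint/synapse correspondence the paper enforces on 3D neuron volumes.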