
Search for: All records

Creators/Authors contains: "Mao, Y"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Free, publicly-accessible full text available June 1, 2023
  2. Free, publicly-accessible full text available May 1, 2023
  3. Abstract: Obtaining accurately calibrated redshift distributions of photometric samples is one of the great challenges in photometric surveys like LSST, Euclid, HSC, KiDS, and DES. We present an inference methodology that combines the redshift information from the galaxy photometry with constraints from two-point functions, utilizing cross-correlations with spatially overlapping spectroscopic samples, and illustrate the approach on CosmoDC2 simulations. Our likelihood framework is designed to integrate directly into a typical large-scale structure and weak lensing analysis based on two-point functions. We discuss efficient and accurate inference techniques that allow us to scale the method to the large samples of galaxies to be expected in LSST. We consider statistical challenges like the parametrization of redshift systematics, discuss and evaluate techniques to regularize the sample redshift distributions, and investigate techniques that can help to detect and calibrate sources of systematic error using posterior predictive checks. We evaluate and forecast photometric redshift performance using data from the CosmoDC2 simulations, within which we mimic a DESI-like spectroscopic calibration sample for cross-correlations. Using a combination of spatial cross-correlations and photometry, we show that we can provide calibration of the mean of the sample redshift distribution to an accuracy of at least 0.002(1 + z), consistent with the LSST-Y1 science requirements for weak lensing and large-scale structure probes.
    Free, publicly-accessible full text available December 9, 2022
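The cross-correlation calibration described in item 3 can be illustrated with a toy clustering-redshift estimator. The sketch below is a minimal numpy version, assuming hypothetical pre-measured correlation amplitudes per spectroscopic bin; the paper's actual method is a full two-point-function likelihood, not this simple estimator.

```python
import numpy as np

def clustering_nz(w_ps, w_ss, dz):
    """Toy clustering-redshift estimate of the photometric sample's n(z).

    w_ps : cross-correlation amplitude of the photometric sample with
           spectroscopic galaxies in each redshift bin
    w_ss : auto-correlation amplitude of the spectroscopic sample in each
           bin (divides out the spectroscopic-sample clustering bias)
    dz   : width of the redshift bins
    """
    raw = w_ps / np.sqrt(w_ss)       # bias-corrected amplitude per bin
    raw = np.clip(raw, 0.0, None)    # enforce non-negativity
    return raw / (raw.sum() * dz)    # normalize so integral of n(z) dz = 1

# Hypothetical measurements in 5 spec-z bins of width 0.2
z_mid = np.array([0.3, 0.5, 0.7, 0.9, 1.1])
w_ps = np.array([0.02, 0.08, 0.12, 0.06, 0.01])
w_ss = np.array([0.25, 0.22, 0.20, 0.18, 0.16])

nz = clustering_nz(w_ps, w_ss, dz=0.2)
mean_z = np.sum(z_mid * nz) * 0.2    # mean redshift of the calibrated n(z)
```

The quantity being calibrated to 0.002(1 + z) in the abstract is the analogue of `mean_z` here, estimated per tomographic bin.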
  4. Abstract: We use a recent census of the Milky Way (MW) satellite galaxy population to constrain the lifetime of particle dark matter (DM). We consider two-body decaying dark matter (DDM) in which a heavy DM particle decays with lifetime τ comparable to the age of the universe to a lighter DM particle (with mass splitting ϵ) and to a dark radiation species. These decays impart a characteristic “kick velocity,” V_kick = ϵc, on the DM daughter particles, significantly depleting the DM content of low-mass subhalos and making them more susceptible to tidal disruption. We fit the suppression of the present-day DDM subhalo mass function (SHMF) as a function of τ and V_kick using a suite of high-resolution zoom-in simulations of MW-mass halos, and we validate this model on new DDM simulations of systems specifically chosen to resemble the MW. We implement our DDM SHMF predictions in a forward model that incorporates inhomogeneities in the spatial distribution and detectability of MW satellites and uncertainties in the mapping between galaxies and DM halos, the properties of the MW system, and the disruption of subhalos by the MW disk using an empirical model for the galaxy–halo connection. By comparing to the observed MW satellite population, we conservatively exclude DDM models with τ < 18 Gyr (29 Gyr) for V_kick = 20 km/s (40 km/s) at 95% confidence. These constraints are among the most stringent and robust small-scale structure limits on the DM particle lifetime and strongly disfavor DDM models that have been proposed to alleviate the Hubble and S8 tensions.
    Free, publicly-accessible full text available June 1, 2023
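Since the kick velocity in item 4 is V_kick = ϵc, each benchmark kick maps directly to a fractional mass splitting. A quick numeric check (pure arithmetic on the abstract's quoted kicks, not a result from the paper's simulations):

```python
# V_kick = eps * c, so eps = V_kick / c for the abstract's two benchmark kicks.
C_KM_S = 299_792.458  # speed of light in km/s

def mass_splitting(v_kick_km_s):
    """Fractional mass splitting eps implied by a given kick velocity."""
    return v_kick_km_s / C_KM_S

eps_20 = mass_splitting(20.0)  # splitting probed by the tau < 18 Gyr limit
eps_40 = mass_splitting(40.0)  # splitting probed by the tau < 29 Gyr limit
```

Both splittings are of order 10^-4 or below, i.e. the constraints apply to nearly degenerate parent and daughter particles.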
  5. Abstract: Modeling student learning processes is highly complex since it is influenced by many factors such as motivation and learning habits. The high volume of features and tools provided by computer-based learning environments confounds the task of tracking student knowledge even further. Deep learning models such as Long Short-Term Memory (LSTM) networks and classic Markovian models such as Bayesian Knowledge Tracing (BKT) have been successfully applied for student modeling. However, much of this prior work is designed to handle sequences of events with discrete timesteps, rather than considering the continuous aspect of time. Given that the time elapsed between successive elements in a student’s trajectory can vary from seconds to days, we applied a Time-aware LSTM (T-LSTM) to model the dynamics of student knowledge state in continuous time. We investigate the effectiveness of T-LSTM on two domains with very different characteristics. One involves an open-ended programming environment where students can self-pace their progress, and T-LSTM is compared against LSTM, Recent Temporal Pattern Mining, and classic Logistic Regression (LR) on the early prediction of student success; the other involves a classic tutor-driven intelligent tutoring system where the tutor scaffolds the student learning step by step, and T-LSTM is compared with LSTM, LR, and BKT on the early prediction of student learning gains. Our results show that T-LSTM significantly outperforms the other methods on the self-paced, open-ended programming environment, while on the tutor-driven ITS it ties with LSTM and outperforms both LR and BKT. In other words, while time-irregularity exists in both datasets, T-LSTM works significantly better than other student models when the pace is driven by students. On the other hand, when such irregularity results from the tutor, T-LSTM was not superior to other models, but its performance was not hurt either.
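The time-aware mechanism named in item 5 discounts the short-term component of the LSTM cell memory by a decreasing function of the elapsed time between events, so long gaps forget more. The single-step numpy sketch below uses random placeholder weights and the discount g(Δt) = 1/log(e + Δt); it illustrates the mechanism only, not the trained models evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
H, X = 4, 3  # hidden size, input size (arbitrary small values for the demo)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def elapsed_decay(dt):
    """Monotonically decreasing discount of short-term memory with gap dt."""
    return 1.0 / np.log(np.e + dt)

# Placeholder parameters: one weight matrix per gate (i, f, o, candidate c),
# plus a projection that extracts the short-term part of the cell memory.
W = {g: rng.normal(scale=0.1, size=(H, X + H)) for g in "ifoc"}
W_d = rng.normal(scale=0.1, size=(H, H))

def tlstm_step(x, h_prev, c_prev, dt):
    # 1) Split the memory: short-term part is discounted by elapsed time.
    c_short = np.tanh(W_d @ c_prev)
    c_long = c_prev - c_short
    c_adj = c_long + elapsed_decay(dt) * c_short
    # 2) Standard LSTM gating applied to the time-adjusted memory.
    v = np.concatenate([x, h_prev])
    i, f, o = (sigmoid(W[g] @ v) for g in "ifo")
    c_tilde = np.tanh(W["c"] @ v)
    c = f * c_adj + i * c_tilde
    h = o * np.tanh(c)
    return h, c

h0, c0 = np.zeros(H), np.ones(H)
h1, c1 = tlstm_step(np.ones(X), h0, c0, dt=1.0)     # short gap: mild decay
h2, c2 = tlstm_step(np.ones(X), h0, c0, dt=3600.0)  # long gap: strong decay
```

With identical inputs, the two calls differ only in `dt`, so any difference between `c1` and `c2` comes from the time-aware discounting.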
  6. Early prediction of student difficulty during long-duration learning activities allows a tutoring system to intervene by providing needed support, such as a hint, or by alerting an instructor. To be effective, these predictions must come early and be highly accurate, but such predictions are difficult for open-ended programming problems. In this work, Recent Temporal Patterns (RTPs) are used in conjunction with Support Vector Machine and Logistic Regression to build robust yet interpretable models for early prediction. We performed two tasks: predicting student success and difficulty during a single open-ended novice programming task of drawing a square-shaped spiral. We compared RTP against several machine learning models, ranging from the classic to more recent deep learning models such as Long Short-Term Memory, to predict whether students would be able to complete the programming task. Our results show that RTP-based models outperformed all others, and could successfully classify students after just one minute of a 20-minute exercise (students can spend more than 1 hour on it). To determine when a system might intervene to prevent incompleteness or eventual dropout, we applied RTP at regular intervals to predict whether a student would make progress within the next five minutes, reflecting that they may be having difficulty. RTP successfully classified these students needing interventions over 85% of the time, with increased accuracy using data-driven program features. These results contribute significantly to the potential to build a fully data-driven tutoring system for novice programming.
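Item 6 applies predictions at regular intervals to flag students unlikely to make progress within the next five minutes. The toy Python sketch below shows the shape of that intervention loop, using a hypothetical recent-edit-rate feature and threshold as a stand-in for the paper's RTP-based models.

```python
# Toy interval-based intervention loop: at each check point, score the
# student's activity so far and flag those predicted to stall. The feature
# (edits per second in the last minute) and the threshold are illustrative
# placeholders, not the RTP models from the paper.

def recent_edit_rate(edit_times, now, window=60.0):
    """Edits per second over the last `window` seconds of the session."""
    recent = [t for t in edit_times if now - window <= t <= now]
    return len(recent) / window

def needs_intervention(edit_times, now, threshold=0.02):
    """Flag the student if recent activity has dropped below the threshold."""
    return recent_edit_rate(edit_times, now) < threshold

# An active student (steady edits) vs. one who goes quiet after 2 minutes
active = [float(t) for t in range(0, 600, 10)]  # an edit every 10 s
stuck = [float(t) for t in range(0, 120, 10)]   # stops editing at t = 120 s

# Check both students at three regular intervals (3, 5, and 7 minutes in)
flags_active = [needs_intervention(active, now) for now in (180, 300, 420)]
flags_stuck = [needs_intervention(stuck, now) for now in (180, 300, 420)]
```

In the paper the per-interval decision comes from RTP-based classifiers over richer program features; this loop only shows where such a model plugs in.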