
Search for: All records

Creators/Authors contains: "Ma, Y."

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Boeri, L.; Hennig, R.; Hirschfeld, P.; Profeta, G.; Sanna, A.; Zurek, E. (Eds.)
    Last year, the report of room-temperature superconductivity in high-pressure carbonaceous sulfur hydride marked a major milestone in the history of physics: one of the holy grails of condensed matter research was reached after more than a century of continuing efforts. This long path started with Neil Ashcroft's and Vitaly Ginzburg's visionary insights on high-temperature superconductivity in metallic hydrogen in the 1960s and 1970s, and has led to the current hydride fever, following the report of high-Tc high-pressure superconductivity in H3S in 2014. This Roadmap collects selected contributions from many of the main actors in this exciting chapter of condensed matter history. Key to the rapid progress of this field has been a new course for materials discovery, where experimental and theoretical discoveries proceed hand in hand. The aim of this Roadmap is not only to offer a snapshot of the current status of superconductor materials research, but also to define the theoretical and experimental obstacles that must be overcome for us to realize fully exploitable room-temperature superconductors, and to foresee future strategies and research directions. This means improving synthesis techniques, extending first-principles methods for superconductors and structural search algorithms for crystal structure predictions, but also identifying new approaches to materials discovery based on artificial intelligence.
    Free, publicly-accessible full text available October 1, 2022
  2. Motivation: The question of what combination of attributes drives the adoption of a particular software technology is critical to developers. It determines both those technologies that receive wide support from the community and those which may be abandoned, thus rendering developers' investments worthless. Aim and Context: We model software technology adoption by developers and provide insights on specific technology attributes that are associated with better visibility among alternative technologies. Thus, our findings have practical value for developers seeking to increase the adoption rate of their products. Approach: We leverage social contagion theory and statistical modeling to identify, define, and test empirically measures that are likely to affect software adoption. More specifically, we leverage a large collection of open source repositories to construct a software dependency chain for a specific set of R language source-code files. We formulate logistic regression models, in which developers' software library choices are modeled, to investigate the combination of technological attributes that drive adoption among competing data frame (a core concept in data science languages) implementations in the R language: tidy and data.table. To describe each technology, we quantify key project attributes that might affect adoption (e.g., response times to raised issues, overall deployments, number of open defects, knowledge base) and also characteristics of developers making the selection (performance needs, scale, and their social network). Results: We find that a quick response to raised issues, a larger number of overall deployments, and a larger number of high-score StackExchange questions are associated with higher adoption. Decision makers tend to adopt the technology that is closer to them in the technical dependency network and in the author collaboration network while meeting their performance needs.
To gauge the generalizability of the proposed methodology, we investigate the spread of two popular JavaScript web frameworks, Angular and React, and discuss the results. Future work: We hope that our methodology, encompassing social contagion that captures both rational and irrational preferences, together with the elucidation of key measures from large collections of version control data, provides a general path toward increasing visibility, driving better informed decisions, and producing more sustainable and widely adopted software.
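The modeling approach described above (logistic regression of adoption choices on project attributes) can be sketched as follows. Everything in this sketch is an illustrative assumption: the feature names, the synthetic data, and the gradient-descent fitting routine stand in for the study's actual pipeline, which mines R dependency chains, issue trackers, and StackExchange data.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Minimal logistic regression via gradient descent on the log-loss.
    A sketch of the modeling approach only, not the paper's estimation code."""
    X1 = np.hstack([np.ones((len(X), 1)), X])  # prepend an intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))      # predicted adoption probability
        w -= lr * X1.T @ (p - y) / len(y)      # gradient step on the log-loss
    return w

def predict_adoption(w, X):
    """Probability that a decision maker adopts the focal technology."""
    X1 = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-X1 @ w))

# Hypothetical standardized features per adoption decision (names assumed):
# issue-response speed, total deployments, high-score StackExchange
# questions, and network distance to the technology's authors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
# Synthetic labels: adoption driven up by features 0 and 2, down by feature 3.
y = (X[:, 0] + 0.5 * X[:, 2] - X[:, 3] + 0.3 * rng.normal(size=200) > 0).astype(float)
w = fit_logistic(X, y)
```

The signs of the fitted coefficients then indicate which attributes are associated with higher adoption odds, mirroring how the study reads off the effect of response times, deployments, and knowledge-base size.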
  3. Motivated by the many real-world applications of reinforcement learning (RL) that require safe policy iterations, we consider the problem of off-policy evaluation (OPE), that is, evaluating a new policy using historical data obtained by different behavior policies, under the model of nonstationary episodic Markov Decision Processes (MDPs) with a long horizon and a large action space. Existing importance sampling (IS) methods often suffer from large variance that depends exponentially on the RL horizon H. To solve this problem, we consider a marginalized importance sampling (MIS) estimator that recursively estimates the state marginal distribution for the target policy at every step. MIS achieves a mean-squared error of (1/n) Σ_{t=1}^{H} E_μ[ (d_t^π(s_t)² / d_t^μ(s_t)²) · Var_μ( (π(a_t|s_t)/μ(a_t|s_t)) · (V_{t+1}^π(s_{t+1}) + r_t) | s_t ) ], where μ and π are the logging and target policies, d_t^μ(s_t) and d_t^π(s_t) are the marginal distributions of the state at the t-th step, H is the horizon, n is the sample size, and V_{t+1}^π is the value function of the MDP under π. The result matches the Cramer-Rao lower bound in Jiang and Li [2016] up to a multiplicative factor of H. To the best of our knowledge, this is the first OPE estimation error bound with a polynomial dependence on H. Besides theory, we show empirical superiority of our method in time-varying, partially observable, and long-horizon RL environments.
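The recursive marginal-state estimation at the heart of MIS can be sketched for a small tabular MDP as follows. The data layout, function name, and estimation details here are illustrative assumptions, not the paper's implementation: the target policy's state marginals d_t^π are propagated forward through transition frequencies reweighted by the per-step importance ratio π/μ, and the value estimate sums the reweighted rewards under those marginals.

```python
import numpy as np

def mis_estimate(trajectories, pi, mu, n_states, horizon):
    """Marginalized importance sampling (MIS) OPE for a tabular, nonstationary
    episodic MDP. A sketch of the idea, not the paper's actual estimator.

    trajectories: list of n episodes, each a length-`horizon` list of (s, a, r)
    pi, mu: target / logging policies, arrays of shape (horizon, n_states, n_actions)
    """
    n = len(trajectories)
    counts = np.zeros((horizon, n_states))              # state visit counts
    r_pi = np.zeros((horizon, n_states))                # reweighted mean reward per (t, s)
    P_pi = np.zeros((horizon - 1, n_states, n_states))  # reweighted transition counts
    d_pi = np.zeros((horizon, n_states))                # estimated marginals d_t^pi
    for ep in trajectories:
        d_pi[0, ep[0][0]] += 1.0 / n  # initial state distribution is policy-independent
        for t, (s, a, r) in enumerate(ep):
            w = pi[t, s, a] / mu[t, s, a]  # per-step importance ratio
            counts[t, s] += 1
            r_pi[t, s] += w * r
            if t + 1 < horizon:
                P_pi[t, s, ep[t + 1][0]] += w
    visited = counts > 0
    r_pi[visited] /= counts[visited]
    # Recursion: d_{t+1}^pi(s') = sum_s d_t^pi(s) * P_t^pi(s' | s)
    for t in range(horizon - 1):
        row = counts[t][:, None]
        P_pi[t] = np.divide(P_pi[t], row, out=np.zeros_like(P_pi[t]), where=row > 0)
        d_pi[t + 1] = d_pi[t] @ P_pi[t]
    # Value estimate: expected reweighted reward under the estimated marginals.
    return float(sum(d_pi[t] @ r_pi[t] for t in range(horizon)))
```

As a sanity check, when pi equals mu every importance ratio is 1 and the estimate reduces exactly to the empirical on-policy average return.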
  4. Abstract

    In the field of beam physics, two frontier topics have taken center stage due to their potential to enable new approaches to discovery in a wide swath of science. These areas are: advanced, high gradient acceleration techniques, and x-ray free electron lasers (XFELs). Further, there is intense interest in the marriage of these two fields, with the goal of producing a very compact XFEL. In this context, recent advances in high gradient radio-frequency cryogenic copper structure research have opened the door to the use of surface electric fields between 250 and 500 MV m^−1. Such an approach is foreseen to enable a new generation of photoinjectors with six-dimensional beam brightness beyond the current state-of-the-art by well over an order of magnitude. This advance is an essential ingredient enabling an ultra-compact XFEL (UC-XFEL). In addition, one may accelerate these bright beams to GeV scale in less than 10 m. Such an injector, when combined with inverse free electron laser-based bunching techniques, can produce multi-kA beams with unprecedented beam quality, quantified by 50 nm-rad normalized emittances. The emittance, we note, is the effective area in transverse phase space (x, p_x/(m_e c)) or (y, p_y/(m_e c)) occupied by the beam distribution, and it is relevant to achievable beam sizes as well as setting a limit on FEL wavelength. These beams, when injected into innovative, short-period (1–10 mm) undulators, uniquely enable UC-XFELs having footprints consistent with university-scale laboratories. We describe the architecture and predicted performance of this novel light source, which promises photon production per pulse of a few percent of existing XFEL sources. We review implementation issues including collective beam effects, compact x-ray optics systems, and other relevant technical challenges.
To illustrate the potential of such a light source to fundamentally change the current paradigm of XFELs with their limited access, we examine possible applications in biology, chemistry, materials, atomic physics, industry, and medicine—including the imaging of virus particles—which may profit from this new model of performing XFEL science.
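For reference, the normalized rms emittance mentioned above is conventionally defined in accelerator physics (a standard convention added here for clarity, not stated in the abstract) as

    ε_{n,x} = (1/(m_e c)) · sqrt( ⟨x²⟩⟨p_x²⟩ − ⟨x·p_x⟩² ),

with the analogous expression in (y, p_y); the 50 nm-rad figure quoted above refers to this quantity.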

  5. Free, publicly-accessible full text available April 1, 2022