

Search for: All records

Creators/Authors contains: "Li, Ziwei"

Note: DOI links lead to external publisher sites; some full-text articles may not be freely available during the embargo period.

  1. Free, publicly-accessible full text available December 12, 2025
  2. Abstract Task-incremental learning (Task-IL) aims to enable an intelligent agent to continuously accumulate knowledge from new learning tasks without catastrophically forgetting what it learned in the past. It has drawn increasing attention in recent years, and many algorithms have been proposed to mitigate forgetting in neural networks. However, no existing strategy completely eliminates the problem. Moreover, explaining what knowledge is forgotten, and how, during the incremental learning process remains under-explored. In this paper, we propose KnowledgeDrift, a visual analytics framework, to interpret network forgetting with three objectives: (1) to identify when the network fails to retain past knowledge, (2) to visualize what information has been forgotten, and (3) to diagnose how knowledge acquired by the new model interferes with knowledge learned in the past. Our analytical framework first identifies the occurrence of forgetting by tracking task performance over the incremental learning process and then provides in-depth inspection of drifted information at multiple levels of data granularity. KnowledgeDrift allows analysts and model developers to better understand network forgetting and to compare the performance of different incremental learning algorithms. Three case studies are conducted in the paper to provide further insight and guidance for users to effectively diagnose catastrophic forgetting over time.
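The first step the KnowledgeDrift abstract describes, identifying when forgetting occurs by tracking per-task performance across incremental training, can be sketched with a standard forgetting measure. This is a minimal illustration, not the paper's implementation; the accuracy matrix below is hypothetical data.

```python
# Sketch: detect catastrophic forgetting from a per-task accuracy matrix.
# acc[i][j] = accuracy on task j after training sequentially on tasks 0..i
# (hypothetical numbers, for illustration only).

def forgetting_per_task(acc):
    """For each earlier task j, forgetting = best accuracy ever achieved
    on j minus accuracy on j after the final training stage."""
    n = len(acc)
    final = acc[n - 1]
    result = []
    for j in range(n - 1):  # the most recent task cannot yet be forgotten
        best = max(acc[i][j] for i in range(j, n))
        result.append(round(best - final[j], 4))
    return result

# Example after sequentially learning 3 tasks:
acc = [
    [0.95, 0.0, 0.0],    # after task 0
    [0.70, 0.93, 0.0],   # after task 1: performance on task 0 has drifted
    [0.55, 0.88, 0.91],  # after task 2: tasks 0 and 1 drifted further
]
print(forgetting_per_task(acc))  # [0.4, 0.05]
```

A large value for an early task (here 0.4 for task 0) flags the kind of drift event the framework would then inspect at finer data granularity.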
  3. Abstract Precipitation extremes intensify in most regions in climate model projections. Changes in vertical velocities contribute to the changes in the intensity of precipitation extremes but remain poorly understood. Here, we find that midtropospheric vertical velocities in extratropical precipitation extremes strengthen overall in simulations of twenty-first-century climate change. For each extreme event, we solve the quasigeostrophic omega equation to decompose this strengthening into different physical contributions. We first consider a dry decomposition in which latent heating is treated as an external forcing of upward motion. Much of the positive contribution to upward motion from increased latent heating is offset by negative contributions from increases in dry static stability and changes in the horizontal length scale of vertical velocities. However, taking changes in latent heating as given is a limitation when the aim is to understand changes in precipitation, since latent heating and precipitation are closely linked. Therefore, we also perform a moist decomposition of the changes in vertical velocities in which latent heating is represented through a moist static stability. In the moist decomposition, changes in moist static stability play a key role, and contributions from other factors, such as changes in the depth of the upward motion, increase in importance. While both dry and moist decompositions are self-consistent, the moist dynamical perspective has greater potential to give insight into the causes of the dynamical contributions to changes in precipitation extremes in different regions.
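The dry versus moist decompositions the abstract contrasts can be made concrete with one standard textbook form of the quasigeostrophic omega equation (the paper's exact formulation and notation may differ): sigma is dry static stability, omega the pressure vertical velocity, v_g the geostrophic wind, zeta_g the geostrophic relative vorticity, Phi the geopotential, and J the diabatic (latent) heating rate.

```latex
% Dry decomposition: latent heating J appears as an external forcing term.
\left( \sigma \nabla^2 + f_0^2 \frac{\partial^2}{\partial p^2} \right) \omega
  = f_0 \frac{\partial}{\partial p}\!\left[ \mathbf{v}_g \cdot \nabla \left( \zeta_g + f \right) \right]
  - \nabla^2\!\left[ \mathbf{v}_g \cdot \nabla \frac{\partial \Phi}{\partial p} \right]
  - \frac{\kappa}{p} \nabla^2 J

% Moist decomposition: latent heating is absorbed into a reduced (moist)
% static stability \sigma_m in ascending regions, removing J from the forcing.
\left( \sigma_m \nabla^2 + f_0^2 \frac{\partial^2}{\partial p^2} \right) \omega
  = f_0 \frac{\partial}{\partial p}\!\left[ \mathbf{v}_g \cdot \nabla \left( \zeta_g + f \right) \right]
  - \nabla^2\!\left[ \mathbf{v}_g \cdot \nabla \frac{\partial \Phi}{\partial p} \right]
```

Because latent heating scales with precipitation itself, the moist form, where heating is folded into sigma_m on the left-hand side, avoids treating a precipitation-dependent quantity as an independent forcing, which is the limitation of the dry decomposition noted in the abstract.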