

Search for: All records where Creators/Authors contains: "Lin, Sen"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Free, publicly-accessible full text available October 1, 2024
  2. Free, publicly-accessible full text available October 1, 2024
  3. Free, publicly-accessible full text available May 1, 2024
  4. By learning a sequence of tasks continually, an agent in continual learning (CL) can improve the learning performance of both a new task and 'old' tasks by leveraging forward and backward knowledge transfer, respectively. However, most existing CL methods focus on addressing catastrophic forgetting in neural networks by minimizing the modification of the learnt model for old tasks. This inevitably limits backward knowledge transfer from the new task to the old tasks, because judicious model updates could improve the learning performance of the old tasks as well. To tackle this problem, we first theoretically analyze the conditions under which updating the learnt model of old tasks could be beneficial for CL and also lead to backward knowledge transfer, based on gradient projection onto the input subspaces of old tasks (an illustrative sketch of this projection idea appears after this list). Building on the theoretical analysis, we next develop a ContinUal learning method with Backward knowlEdge tRansfer (CUBER) for a fixed-capacity neural network without data replay. In particular, CUBER first characterizes the task correlation to identify positively correlated old tasks in a layer-wise manner, and then selectively modifies the learnt model of the old tasks when learning the new task. Experimental studies show that CUBER achieves positive backward knowledge transfer on several existing CL benchmarks for the first time without data replay, where the related baselines still suffer from catastrophic forgetting (negative backward knowledge transfer). CUBER's superior backward knowledge transfer also leads to higher accuracy accordingly.
  5. Frustrated Lewis pair (FLP) catalysts have attracted much recent interest because of their exceptional ability to activate small molecules in homogeneous catalysis. Over the past ten years, this unique catalysis concept has been extended, with much success, to heterogeneous catalysis. Here we review recent theoretical advances in understanding FLP-based heterogeneous catalysis across several classes of materials, including metal oxides, functionalized surfaces, and two-dimensional materials. A better understanding of the details of the catalytic mechanism can help guide the experimental design of novel heterogeneous FLP catalysts.
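The gradient-projection idea summarized in item 4 can be made concrete with a small sketch. The NumPy code below is only illustrative and assumes a simplified setting: the function names, the cosine-correlation gate, and the threshold tau are assumptions made here for clarity, not the authors' actual implementation of CUBER.

    import numpy as np

    def project_onto(grad, basis):
        # Project a flattened layer gradient onto span(basis);
        # basis has orthonormal columns spanning an old task's input subspace.
        return basis @ (basis.T @ grad)

    def correlation_gated_step(grad_new, grad_old, basis_old, tau=0.0):
        # grad_new : new-task gradient for one layer, flattened to 1-D
        # grad_old : stored reference gradient of an old task, same layer
        # basis_old: orthonormal basis of that old task's input subspace
        # tau      : correlation threshold (assumed hyperparameter)
        cos = grad_old @ grad_new / (
            np.linalg.norm(grad_old) * np.linalg.norm(grad_new) + 1e-12)
        if cos > tau:
            # Positively correlated old task: keep the full gradient, so the
            # component inside the old subspace may also improve the old task
            # (backward knowledge transfer).
            return grad_new
        # Otherwise protect the old task: remove the component lying in its
        # input subspace (GPM-style orthogonal projection).
        return grad_new - project_onto(grad_new, basis_old)

    # Toy usage: a 6-dim layer gradient and a 2-dim old-task subspace.
    rng = np.random.default_rng(0)
    basis, _ = np.linalg.qr(rng.normal(size=(6, 2)))
    g_new, g_old = rng.normal(size=6), rng.normal(size=6)
    print(correlation_gated_step(g_new, g_old, basis))

The layer-wise gate is the key design choice: when the stored old-task gradient and the new-task gradient point in a similar direction, updating inside the old task's subspace is allowed rather than projected out, which is how the abstract describes the possibility of positive backward transfer.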