-
Within the geo-simulation research domain, micro-simulation and agent-based modeling often require synthetic populations. Creating such data is time-consuming, and the resulting populations often lack social networks, which are crucial for studying human interactions (e.g., disease spread, disaster response) and their impact on decision-making. We address these challenges by introducing a Python-based method that uses open data, including 2020 U.S. Census data, to generate a large-scale, realistic, geographically explicit synthetic population for all 50 U.S. states and Washington, D.C., along with stylized social networks (e.g., home, work, and school). The resulting synthetic population can be used within various geo-simulation approaches (e.g., agent-based modeling) to explore the emergence of complex phenomena through human interactions and to further foster the study of urban digital twins.
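As a rough illustration of the kind of pipeline described above, the sketch below samples individuals from tract-level marginal distributions and wires them into one stylized household network layer. The marginals, function names, and the simplified sampling and grouping scheme are assumptions for illustration only, not the authors' released code.

import random
import networkx as nx

def sample_population(tract_marginals, n_per_tract=100):
    """tract_marginals: {tract_id: {"age_dist": {...}, "sex_dist": {...}}} with probabilities."""
    people = []
    for tract_id, marg in tract_marginals.items():
        ages, age_w = zip(*marg["age_dist"].items())
        sexes, sex_w = zip(*marg["sex_dist"].items())
        for i in range(n_per_tract):
            people.append({
                "id": f"{tract_id}-{i}",
                "tract": tract_id,
                "age": random.choices(ages, weights=age_w)[0],
                "sex": random.choices(sexes, weights=sex_w)[0],
            })
    return people

def build_home_network(people, household_size=4):
    """One stylized layer: group consecutive people into fully connected households."""
    g = nx.Graph()
    g.add_nodes_from(p["id"] for p in people)
    for i in range(0, len(people), household_size):
        members = [p["id"] for p in people[i:i + household_size]]
        g.add_edges_from(nx.complete_graph(members).edges)
    return g

# Hypothetical tract-level marginals; a real run would read these from Census tables.
marginals = {"51059-4825": {"age_dist": {"child": 0.2, "adult": 0.6, "senior": 0.2},
                            "sex_dist": {"F": 0.51, "M": 0.49}}}
pop = sample_population(marginals, n_per_tract=20)
home_net = build_home_network(pop)
print(len(pop), home_net.number_of_edges())

A real workflow would replace the household grouping with household, workplace, and school assignment driven by the Census and other open data the abstract mentions.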
-
Incremental learning is a challenging task in the field of machine learning, and it is a key step towards autonomous learning and adaptation. With increasing attention on neuromorphic computing, there is an urgent need to investigate incremental learning techniques that can work in this paradigm to maintain energy efficiency while benefiting from flexibility and adaptability. In this paper, we present SEMINAR (sensitivity modulated importance networking and rehearsal), an incremental learning algorithm designed specifically for EMSTDP (Error Modulated Synaptic-Timing Dependent Plasticity), which performs supervised learning for multi-layer spiking neural networks (SNNs) implemented on neuromorphic hardware such as Loihi. SEMINAR uses critical synapse selection, differential learning rates, and a replay buffer to enable the model to retain past knowledge while maintaining the flexibility to learn new tasks. Our experimental results show that, when combined with EMSTDP, SEMINAR outperforms different baseline incremental learning algorithms and gives more than 4% improvement on several widely used datasets such as Split-MNIST, Split-Fashion MNIST, Split-NMNIST and MSTAR.
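A loose sketch of the three ingredients named above (critical synapse selection, differential learning rates, and a rehearsal buffer), written as a plain NumPy delta-rule learner rather than EMSTDP on Loihi; the class name, importance proxy, and thresholds are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)

class SeminarLikeLearner:
    def __init__(self, n_in, n_out, lr_high=0.1, lr_low=0.01, buffer_size=64):
        self.w = rng.normal(0, 0.1, (n_in, n_out))
        self.importance = np.zeros_like(self.w)  # accumulated sensitivity per synapse
        self.lr_high, self.lr_low = lr_high, lr_low
        self.buffer = []                         # small rehearsal memory of past samples
        self.buffer_size = buffer_size

    def _lr(self):
        # Differential learning rate: critical (high-importance) synapses update slowly
        # so knowledge from earlier tasks is retained.
        critical = self.importance > np.median(self.importance)
        return np.where(critical, self.lr_low, self.lr_high)

    def step(self, x, y_onehot):
        # Simple delta-rule update standing in for the EMSTDP weight change.
        grad = np.outer(x, y_onehot - x @ self.w)
        self.w += self._lr() * grad
        # Accumulated |gradient| serves as a proxy for synapse sensitivity/importance.
        self.importance += np.abs(grad)
        # Rehearsal: remember this sample and replay one stored sample per step.
        self.buffer.append((x, y_onehot))
        if len(self.buffer) > self.buffer_size:
            self.buffer.pop(0)
        xr, yr = self.buffer[rng.integers(len(self.buffer))]
        self.w += self._lr() * np.outer(xr, yr - xr @ self.w)

learner = SeminarLikeLearner(n_in=8, n_out=3)
learner.step(rng.random(8), np.eye(3)[0])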
-
Abstract: Ferroptosis has been shown to play a crucial role in preventing cancer development, but the underlying mechanisms by which dysregulated genes and genetic alterations drive cancer development through the regulation of ferroptosis remain unclear. Here, we showed that the synergistic role of ELF3 overexpression and PTEN deficiency in driving lung cancer development was highly dependent on the regulation of ferroptosis. Human ELF3 (hELF3) overexpression in murine lung epithelial cells caused only hyperplasia, with increased proliferation and ferroptosis. hELF3 overexpression and Pten genetic disruption significantly induced lung tumor development, with increased proliferation and inhibited ferroptosis. Mechanistically, we found this was due to induction of SLC7A11, a typical ferroptosis inhibitor, and that ELF3 directly and positively regulated SLC7A11 in the PTEN-deficient background. Erastin-mediated inhibition of SLC7A11 induced ferroptosis in cells with ELF3 overexpression and PTEN deficiency, thereby inhibiting cell colony formation and tumor development. Clinically, human lung tumors showed a negative correlation between ELF3 and PTEN expression, and a positive correlation between ELF3 and SLC7A11 in the subset of human lung tumors with low PTEN expression. ELF3 and SLC7A11 expression levels were negatively associated with lung cancer patients' survival rates. In summary, ferroptosis induction can effectively attenuate lung tumor development induced by ELF3 overexpression and PTEN downregulation or loss-of-function mutations.
-
Deep neural networks easily memorize noisy labels under a softmax cross entropy (CE) loss. Previous studies that attempted to address this issue focused on incorporating a noise-robust loss function into the CE loss. However, the memorization issue is only alleviated, not eliminated, because of the non-robust CE term. To address this issue, we focus on learning robust contrastive representations of the data on which it is hard for the classifier to memorize label noise under the CE loss. We propose a novel contrastive regularization function to learn such representations over noisy data, where the label noise does not dominate the representation learning. By theoretically investigating the representations induced by the proposed regularization function, we show that the learned representations keep information related to true labels and discard information related to corrupted labels. Moreover, our theoretical results indicate that the learned representations are robust to label noise. Experiments on benchmark datasets demonstrate the efficacy of our method.
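The sketch below shows one plausible way to pair a softmax CE loss with a contrastive regularization term over two augmented views, in the spirit of the description above; the paper's exact regularization function is not reproduced here, and the InfoNCE-style form, function names, and hyperparameters are assumptions.

import torch
import torch.nn.functional as F

def contrastive_regularizer(z1, z2, temperature=0.5):
    """InfoNCE-style term: each sample's other augmented view is its positive."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)        # (2B, d) stacked views
    sim = z @ z.t() / temperature         # (2B, 2B) pairwise similarities
    sim.fill_diagonal_(float("-inf"))     # exclude self-similarity
    b = z1.size(0)
    targets = torch.cat([torch.arange(b, 2 * b), torch.arange(0, b)]).to(z.device)
    return F.cross_entropy(sim, targets)

def training_loss(logits, labels, z1, z2, lam=1.0):
    """Softmax CE on (possibly noisy) labels plus the contrastive representation term."""
    return F.cross_entropy(logits, labels) + lam * contrastive_regularizer(z1, z2)

B, d, C = 32, 128, 10
loss = training_loss(torch.randn(B, C), torch.randint(0, C, (B,)),
                     torch.randn(B, d), torch.randn(B, d))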
