Ferro-rotational (FR) materials, renowned for their distinctive material functionalities, present challenges in the growth of homo-FR crystals (i.e., crystals with a single FR domain). This study explores a cost-effective approach to growing homo-FR helimagnetic RbFe(SO4)2 (RFSO) crystals by lowering the crystal growth temperature below the TFR threshold using the high-pressure hydrothermal method. Polarized neutron diffraction experiments show that nearly 86% of RFSO crystals consist of a single homo-FR domain. Notably, RFSO displays remarkable stability in the FR phase, with an exceptionally high TFR of ≈573 K. Furthermore, RFSO exhibits a chiral helical magnetic structure with switchable ferroelectric polarization below 4 K. Importantly, external electric fields can induce a single magnetic domain state and manipulate its magnetic chirality. The findings suggest that the search for new FR magnets with outstanding material properties should consider magnetic sulfates as promising candidates.
-
block2 is an open source framework to implement and perform density matrix renormalization group and matrix product state algorithms. Out of the box, it supports the eigenstate, time-dependent, response, and finite-temperature algorithms. In addition, it carries special optimizations for ab initio electronic structure Hamiltonians and implements many quantum chemistry extensions to the density matrix renormalization group, such as dynamical correlation theories. The code is designed with an emphasis on flexibility, extensibility, and efficiency, and to support integration with external numerical packages. Here, we explain the design principles and currently supported features, and present numerical examples in a range of applications.
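The matrix product states underlying these algorithms can be illustrated without block2 itself. The following minimal numpy sketch (not block2's API; all names here are illustrative) compresses a dense state vector into a left-canonical MPS by successive SVDs, then contracts it back to verify the decomposition is exact when no bond truncation is applied:

```python
import numpy as np

def to_mps(psi, n_sites, d=2, max_bond=None):
    """Split a length-d**n_sites state vector into a left-canonical MPS
    via successive SVDs; truncate each bond to max_bond if given."""
    tensors = []
    bond = 1
    rest = psi.reshape(bond, -1)
    for _ in range(n_sites - 1):
        rest = rest.reshape(bond * d, -1)
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        if max_bond is not None:
            u, s, vh = u[:, :max_bond], s[:max_bond], vh[:max_bond]
        tensors.append(u.reshape(bond, d, -1))   # site tensor (left, phys, right)
        bond = u.shape[1]
        rest = np.diag(s) @ vh                   # carry the remainder rightward
    tensors.append(rest.reshape(bond, d, 1))
    return tensors

def from_mps(tensors):
    """Contract the MPS back into a dense state vector."""
    psi = tensors[0]
    for t in tensors[1:]:
        psi = np.tensordot(psi, t, axes=([-1], [0]))
    return psi.reshape(-1)

n = 6
rng = np.random.default_rng(0)
psi = rng.normal(size=2**n)
psi /= np.linalg.norm(psi)
mps = to_mps(psi, n)                     # no truncation: exact representation
print(np.allclose(from_mps(mps), psi))   # True
```

Truncating `max_bond` turns this into the controlled-accuracy compression that DMRG exploits; production codes such as block2 additionally handle symmetries and operator (MPO) representations.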
-
Learning to optimize (L2O) has gained increasing popularity by automating the design of optimizers through data-driven approaches. However, current L2O methods often suffer from poor generalization performance in at least two respects: (i) applying the L2O-learned optimizer to unseen optimizees, in terms of lowering their loss function values (optimizer generalization, or "generalizable learning of optimizers"); and (ii) the test performance of an optimizee (itself a machine learning model), trained by the optimizer, in terms of accuracy over unseen data (optimizee generalization, or "learning to generalize"). While optimizer generalization has been recently studied, optimizee generalization (learning to generalize) has not been rigorously studied in the L2O context, which is the aim of this paper. We first theoretically establish an implicit connection between the local entropy and the Hessian, and hence unify their roles in the handcrafted design of generalizable optimizers as equivalent metrics of the flatness of loss landscapes. We then propose to incorporate these two metrics as flatness-aware regularizers into the L2O framework in order to meta-train optimizers to learn to generalize, and theoretically show that such generalization ability can be learned during the L2O meta-training process and then transferred to the optimizee loss function. Extensive experiments consistently validate the effectiveness of our proposals, with substantially improved generalization on multiple sophisticated L2O models and diverse optimizees.
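To make the Hessian-based flatness metric concrete, here is a generic numpy sketch (an illustration of the idea, not the paper's implementation; the regularization weight and all names are arbitrary) that estimates the Hessian trace at a point via Hutchinson's estimator with finite-difference Hessian-vector products, and adds it to a loss as a flatness penalty:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy quadratic loss L(w) = 0.5 * w^T A w, whose Hessian is exactly A,
# so the trace estimate can be checked against np.trace(A).
n = 20
M = rng.normal(size=(n, n))
A = M @ M.T / n                  # symmetric PSD Hessian
loss = lambda w: 0.5 * w @ A @ w
grad = lambda w: A @ w

def hutchinson_trace(grad, w, n_probes=200, eps=1e-4):
    """Estimate tr(Hessian) at w using Rademacher probe vectors and
    finite-difference Hessian-vector products Hv ~ (g(w+ev)-g(w-ev))/(2e)."""
    total = 0.0
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=w.shape)
        hv = (grad(w + eps * v) - grad(w - eps * v)) / (2 * eps)
        total += v @ hv
    return total / n_probes

w = rng.normal(size=n)
flat_penalty = hutchinson_trace(grad, w)
reg_loss = loss(w) + 0.1 * flat_penalty   # flatness-regularized objective
```

A large Hessian trace signals a sharp minimum, so penalizing it biases training toward flat regions; in the L2O setting described above, such a penalty would enter the meta-training objective of the learned optimizer rather than a single optimizee's loss.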