Title: Harnessing deep neural networks to solve inverse problems in quantum dynamics: machine-learned predictions of time-dependent optimal control fields
Inverse problems continue to garner immense interest in the physical sciences, particularly in the context of controlling desired phenomena in non-equilibrium systems. In this work, we utilize a series of deep neural networks for predicting time-dependent optimal control fields, E(t), that enable desired electronic transitions in reduced-dimensional quantum dynamical systems. To solve this inverse problem, we investigated two independent machine learning approaches: (1) a feedforward neural network for predicting the frequency and amplitude content of the power spectrum in the frequency domain (i.e., the Fourier transform of E(t)), and (2) a cross-correlation neural network approach for directly predicting E(t) in the time domain. Both of these machine learning methods give complementary approaches for probing the underlying quantum dynamics and also exhibit impressive performance in accurately predicting both the frequency and strength of the optimal control field. We provide detailed architectures and hyperparameters for these deep neural networks as well as performance metrics for each of our machine-learned models. From these results, we show that machine learning, particularly deep neural networks, can be employed as cost-effective statistical approaches for designing electromagnetic fields to enable desired transitions in these quantum dynamical systems.
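The data flow of approach (1) can be sketched as follows. This is a minimal illustration only: the layer sizes, the system descriptors, the untrained random weights, and the zero-phase spectral reconstruction are all assumptions for the sketch, not the paper's actual architecture or training setup.

```python
# Sketch of approach (1): a feedforward network maps system descriptors to
# the amplitude content of E(t) in the frequency domain, then an inverse FFT
# recovers a time-domain field. All shapes and weights are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def feedforward(x, weights, biases):
    """Plain fully connected network with a tanh hidden activation."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.tanh(h @ W + b)
    out = h @ weights[-1] + biases[-1]
    return np.log1p(np.exp(out))          # softplus keeps amplitudes >= 0

n_in, n_hidden, n_freq = 8, 32, 64        # hypothetical sizes
weights = [rng.normal(0, 0.1, (n_in, n_hidden)),
           rng.normal(0, 0.1, (n_hidden, n_freq))]
biases = [np.zeros(n_hidden), np.zeros(n_freq)]

descriptors = rng.normal(size=n_in)       # stand-in for quantum-system features
amplitudes = feedforward(descriptors, weights, biases)

# Reconstruct a real-valued E(t) from the one-sided amplitude spectrum
# (phases are set to zero here purely for illustration).
E_t = np.fft.irfft(amplitudes, n=2 * (n_freq - 1))
print(E_t.shape)                          # (126,)
```

In a trained version of this sketch, the weights would be fit so that the predicted spectrum, once inverse-transformed, matches the optimal control field for the queried transition.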
Award ID(s):
1833218 1808242
PAR ID:
10198696
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
Physical Chemistry Chemical Physics
Volume:
22
Issue:
40
ISSN:
1463-9076
Page Range / eLocation ID:
22889 to 22899
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Cellular materials have been widely used in load-carrying lightweight structures. Although reduced weight increases natural frequency, the low stiffness of cellular structures decreases it. Designing structures with a higher natural frequency can usually avoid resonance. In addition, because less material is used in cellular structures, energy absorption capability usually decreases, such as under impact loading. Therefore, designing cellular structures with both higher natural frequency and higher energy absorption capability is highly desired. In this study, machine learning and novel inverse design techniques enabled the search of a huge space of unexplored structural designs: machine learning regression and generative adversarial networks (GANs) were used to form an inverse design framework. Using this framework, optimal cellular unit cells were discovered that surpass, in terms of natural frequency and impact resistance, the performance of biomimetic structures inspired by honeycomb, plant stems, and trabecular bone. The discovered optimal cellular unit cells exhibited 30–100% higher natural frequency and 300% higher energy absorption than their biomimetic counterparts, and were validated through experimental and simulation comparisons. The machine learning framework in this study would help in designing load-carrying engineering structures with increased natural frequency and enhanced energy absorption capability.
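The regression half of such an inverse-design framework can be sketched with a toy surrogate: fit a cheap model that predicts a performance score from unit-cell parameters, then screen a large candidate pool with it. The quadratic "simulator", the parameter count, and the least-squares surrogate below are stand-ins for the study's finite-element analyses and neural-network regressors.

```python
# Hedged sketch of surrogate-based inverse design: fit a cheap predictor on a
# few "simulated" designs, then search many candidates for the best predicted
# one. The toy simulator peaks at parameter values of 0.3 by construction.
import numpy as np

rng = np.random.default_rng(1)

def toy_simulator(x):
    """Stand-in for an expensive simulation of a cellular unit cell."""
    return -np.sum((x - 0.3) ** 2, axis=-1)   # best score at x = 0.3

# 1. Generate training data from a modest number of "simulated" designs.
X_train = rng.uniform(0, 1, size=(200, 5))    # 5 hypothetical cell parameters
y_train = toy_simulator(X_train)

# 2. Fit a simple polynomial surrogate by least squares (a neural-network
#    regressor would play this role in the paper's framework).
features = np.hstack([X_train, X_train**2, np.ones((len(X_train), 1))])
coef, *_ = np.linalg.lstsq(features, y_train, rcond=None)

# 3. Inverse design: screen a large random candidate pool with the surrogate.
candidates = rng.uniform(0, 1, size=(50_000, 5))
cand_feats = np.hstack([candidates, candidates**2, np.ones((len(candidates), 1))])
best = candidates[np.argmax(cand_feats @ coef)]
print(np.round(best, 2))                      # parameters near the optimum 0.3
```

A generative model (the GAN in the study above) would replace the random candidate pool, proposing new unit-cell geometries instead of sampling them blindly.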
  2.
    The recent striking success of deep neural networks in machine learning raises profound questions about the theoretical principles underlying their success. For example, what can such deep networks compute? How can we train them? How does information propagate through them? Why can they generalize? And how can we teach them to imagine? We review recent work in which methods of physical analysis rooted in statistical mechanics have begun to provide conceptual insights into these questions. These insights yield connections between deep learning and diverse physical and mathematical topics, including random landscapes, spin glasses, jamming, dynamical phase transitions, chaos, Riemannian geometry, random matrix theory, free probability, and nonequilibrium statistical mechanics. Indeed, the fields of statistical mechanics and machine learning have long enjoyed a rich history of strongly coupled interactions, and recent advances at the intersection of statistical mechanics and deep learning suggest these interactions will only deepen going forward. 
  3.
    Grid cells in the brain fire in strikingly regular hexagonal patterns across space. There are currently two seemingly unrelated frameworks for understanding these patterns. Mechanistic models account for hexagonal firing fields as the result of pattern-forming dynamics in a recurrent neural network with hand-tuned center-surround connectivity. Normative models specify a neural architecture, a learning rule, and a navigational task, and observe that grid-like firing fields emerge due to the constraints of solving this task. Here we provide an analytic theory that unifies the two perspectives by casting the learning dynamics of neural networks trained on navigational tasks as a pattern-forming dynamical system. This theory provides insight into the optimal solutions of diverse formulations of the normative task, and shows that symmetries in the representation of space correctly predict the structure of learned firing fields in trained neural networks. Further, our theory proves that a nonnegativity constraint on firing rates induces a symmetry-breaking mechanism which favors hexagonal firing fields. We extend this theory to the case of learning multiple grid maps and demonstrate that optimal solutions consist of a hierarchy of maps with increasing length scales. These results unify previous accounts of grid cell firing and provide a novel framework for predicting the learned representations of recurrent neural networks.
  4. Machine learning, as a study of algorithms that automate prediction and decision-making based on complex data, has become one of the most effective tools in the study of artificial intelligence. In recent years, scientific communities have been gradually merging data-driven approaches with research, enabling dramatic progress in revealing underlying mechanisms, predicting essential properties, and discovering unconventional phenomena. It is becoming an indispensable tool in the fields of, for instance, quantum physics, organic chemistry, and medical imaging. Very recently, machine learning has been adopted in the research of photonics and optics as an alternative approach to address the inverse design problem. In this report, the rapid advances of machine-learning-enabled photonic design strategies in the past few years are summarized. In particular, the focus is on deep learning methods, a subset of machine learning algorithms, for handling intractable, high-degrees-of-freedom structure design.
  5.
    Automated inverse design methods are critical to the development of metamaterial systems that exhibit special user-demanded properties. While machine learning approaches represent an emerging paradigm in the design of metamaterial structures, the ability to retrieve inverse designs on-demand remains lacking. Such an ability can be useful in accelerating optimization-based inverse design processes. This paper develops an inverse design framework that provides this capability through the novel usage of invertible neural networks (INNs). We exploit an INN architecture that can be trained to perform forward prediction over a set of high-fidelity samples and automatically learns the reverse mapping with guaranteed invertibility. We apply this INN for modeling the frequency response of periodic and aperiodic phononic structures, with the performance demonstrated on vibration suppression of drill pipes. Training and testing samples are generated by employing a transfer matrix method. The INN models provide competitive forward and inverse prediction performance compared to typical deep neural networks (DNNs). These INN models are used to retrieve approximate inverse designs for a queried non-resonant frequency range; the inverse designs are then used to initialize a constrained gradient-based optimization process to find a more accurate inverse design that also minimizes mass. The INN-initialized optimizations are found to be generally superior in terms of the queried property and mass compared to randomly initialized and inverse DNN-initialized optimizations. Particle swarm optimization with INN-derived initial points is then found to provide even better solutions, especially for the higher-dimensional aperiodic structures.
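The "guaranteed invertibility" of an INN comes from composing layers whose inverses exist in closed form. A common building block is the affine coupling layer, sketched below; the tiny scale and translation sub-networks use fixed random weights purely for illustration and are not the architecture used in the paper above.

```python
# Sketch of an affine coupling layer, the key INN ingredient: the forward map
# transforms half the vector conditioned on the other (untouched) half, so the
# inverse is available exactly, no matter what the sub-networks s and t are.
import numpy as np

rng = np.random.default_rng(2)
W_s, W_t = rng.normal(0, 0.5, (2, 2)), rng.normal(0, 0.5, (2, 2))

def s(x):  # scale sub-network (need not be invertible itself)
    return np.tanh(x @ W_s)

def t(x):  # translation sub-network
    return np.tanh(x @ W_t)

def coupling_forward(x):
    """Split the input; transform the second half conditioned on the first."""
    x1, x2 = x[:2], x[2:]
    return np.concatenate([x1, x2 * np.exp(s(x1)) + t(x1)])

def coupling_inverse(y):
    """Exact inverse: undo the affine transform using the untouched half."""
    y1, y2 = y[:2], y[2:]
    return np.concatenate([y1, (y2 - t(y1)) * np.exp(-s(y1))])

x = rng.normal(size=4)                    # a 4-dimensional toy design vector
y = coupling_forward(x)
x_rec = coupling_inverse(y)
print(np.allclose(x, x_rec))              # True: invertibility is exact
```

Stacking such layers (with the split halves alternating) yields an expressive forward model whose reverse mapping, e.g. property back to design, is obtained for free once the forward direction is trained.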