Reliable computational methodologies and basis sets for modeling x-ray spectra are essential for extracting and interpreting electronic and structural information from experimental x-ray spectra. In particular, the trade-off between numerical accuracy and computational cost arising from the size of the basis set is a major challenge, since molecular orbitals undergo extreme relaxation in the core-hole state. To capture the changes in electronic structure induced by the formation of a core hole, a sufficiently flexible basis for expanding the orbitals, particularly in the core region, has been shown to be essential. This work focuses on the refinement of core-hole ionized state calculations using the equation-of-motion coupled-cluster family of methods through an extensive analysis of the effectiveness of “hybrid” and mixed basis sets. In this investigation, we employ the CVS-EOMIP-CCSD method in combination with hybrid basis sets constructed piecewise from readily available Dunning correlation-consistent basis sets in order to calculate x-ray ionization energies (IEs) for a set of small gas-phase molecules. Our results provide insights into the impact of basis sets on CVS-EOMIP-CCSD calculations of K-edge IEs of first-row p-block elements. These insights clarify the basis set dependence of the computed core IEs and allow us to establish a protocol for deriving reliable and cost-effective theoretical estimates of the IEs of small molecules containing such elements.
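As a rough illustration of the kind of mixed-basis workflow involved (our sketch, not the authors' production setup), the snippet below assigns element-wise Dunning basis sets in PySCF and runs EOM-IP-CCSD. Note that PySCF's ipccsd() targets valence ionizations by default; a core-valence-separated (CVS) projection would be needed to reach K-edge IEs.

```python
# Minimal sketch, assuming PySCF: a "mixed" basis assigned per element,
# pairing a core-valence set on the heavy atom with a smaller valence
# set on hydrogen. This illustrates the piecewise basis construction
# only; it is not the CVS-EOMIP-CCSD protocol of the paper.
from pyscf import gto, scf, cc

mol = gto.M(
    atom="O 0 0 0; H 0 -0.757 0.587; H 0 0.757 0.587",  # water, approx. geometry
    basis={"O": "cc-pcvtz", "H": "cc-pvdz"},            # element-wise (mixed) basis
    verbose=0,
)
mf = scf.RHF(mol).run()            # ground-state Hartree-Fock reference
mycc = cc.CCSD(mf).run()           # CCSD correlation treatment
e_ip, _ = mycc.ipccsd(nroots=3)    # EOM-IP-CCSD roots (valence IPs, no CVS)
print([27.2114 * e for e in e_ip]) # convert hartree to eV
```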
Fast and Efficient Evaluation of Integrals Arising in Iso-geometric Analysis
Integral equation based analysis of scattering (both acoustic and electromagnetic) is well known and has been studied for several decades. Analysis typically proceeds along the following lines: representation of the geometry using a collection of triangles, representation of the physics using piecewise constant basis functions defined on each triangle, and solution of the resulting discrete system. In the past few decades, this area has seen several algorithmic improvements that reduce the computational complexity of the analysis. But as we move to higher-order isogeometric analysis (IGA), these algorithms are bogged down by the cost of integration. In this paper, we seek to address this challenge. Our candidates for modeling both geometry and physics are subdivision basis sets, whose order is sufficiently high to make the challenge apparent. We present a methodology that ameliorates this cost for both acoustic and electromagnetic integral equations and demonstrate its efficacy.
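To make the cost pattern concrete, here is a minimal sketch (ours, with assumed names, one-point quadrature, and a crude self-term approximation) of assembling a dense acoustic single-layer matrix over piecewise-constant triangle elements. Production IGA codes use higher-order subdivision bases and dedicated singular quadrature, which is exactly where the integration cost the paper targets grows.

```python
# Illustrative sketch only: 3D Helmholtz single-layer assembly with
# piecewise-constant basis functions and centroid quadrature.
import numpy as np

def assemble_single_layer(verts, tris, k):
    """Dense N x N single-layer matrix; one unknown per triangle."""
    centroids = verts[tris].mean(axis=1)              # one point per triangle
    e1 = verts[tris[:, 1]] - verts[tris[:, 0]]
    e2 = verts[tris[:, 2]] - verts[tris[:, 0]]
    areas = 0.5 * np.linalg.norm(np.cross(e1, e2), axis=1)
    r = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)
    np.fill_diagonal(r, 1.0)                          # avoid 0/0; fixed below
    G = np.exp(1j * k * r) / (4.0 * np.pi * r)        # Helmholtz Green's function
    A = G * areas[None, :]                            # weight by source area
    # Self terms are weakly singular; the equal-area-disk approximation
    # of the static integral (a/2 with a = sqrt(area/pi)) stands in for
    # the analytic/Duffy-type integration a production code would use.
    np.fill_diagonal(A, np.sqrt(areas / np.pi) / 2.0)
    return A

verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], float)
tris = np.array([[0, 1, 2], [1, 3, 2]])
A = assemble_single_layer(verts, tris, k=2 * np.pi)
print(A.shape)   # (2, 2): piecewise-constant discretization, per the abstract
```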
- Award ID(s): 1725278
- PAR ID: 10182400
- Date Published:
- Journal Name: 2020 International Symposium on Antennas and Propagation
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Finding overcomplete latent representations of data has applications in data analysis, signal processing, machine learning, theoretical neuroscience, and many other fields. In an overcomplete representation, the number of latent features exceeds the data dimensionality, which is useful when the data is undersampled by the measurements (compressed sensing or information bottlenecks in neural systems) or composed of multiple complete sets of linear features, each spanning the data space. Independent Component Analysis (ICA) is a linear technique for learning sparse latent representations, which typically has a lower computational cost than sparse coding, a linear generative model that requires an iterative, nonlinear inference step. While ICA is well suited to finding complete representations, we show that overcompleteness poses a challenge to existing ICA algorithms. Specifically, the coherence control used in existing ICA and other dictionary learning algorithms, which is necessary to prevent the formation of duplicate dictionary features, is ill-suited to the overcomplete case. We show that in the overcomplete case, several existing ICA algorithms have undesirable global minima that maximize coherence. We provide a theoretical explanation of these failures and, based on the theory, propose improved coherence-control costs for overcomplete ICA algorithms. Further, by comparing ICA algorithms to the computationally more expensive sparse coding on synthetic data, we show that the limited applicability of overcomplete, linear inference can be extended with the proposed cost functions. Finally, when trained on natural images, we show that coherence control biases the exploration of the data manifold, sometimes yielding suboptimal, coherent solutions. All told, this study contributes new insights into, and methods for, coherence control for linear ICA, some of which are applicable to many other nonlinear models.
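As a rough sketch of the coherence-control idea discussed in the item above, here is a generic quasi-orthogonality penalty on an overcomplete dictionary (our illustration; it is not the specific improved costs proposed in the work).

```python
# Hedged sketch: penalize large pairwise cosine similarities between
# rows of an overcomplete dictionary W (n_features x n_data_dims), so
# that duplicate (maximally coherent) features become expensive.
import numpy as np

def coherence_penalty(W, eps=1e-8):
    """Sum of -log(1 - c^2) over off-diagonal coherences c."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)   # unit-norm features
    C = Wn @ Wn.T                                       # Gram (coherence) matrix
    off = C[~np.eye(len(C), dtype=bool)]                # off-diagonal entries
    # -log(1 - c^2) diverges as |c| -> 1, discouraging duplicate features
    return -np.sum(np.log(1.0 - off**2 + eps))

rng = np.random.default_rng(0)
W = rng.standard_normal((128, 64))    # 2x overcomplete: 128 features in R^64
print(coherence_penalty(W))           # added to the ICA objective as a penalty
```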
-
Forecasting ground magnetic field perturbations has been a long-standing goal of the space weather community. The availability of ground magnetic field data and its potential use in geomagnetically induced current studies, such as risk assessment, have resulted in several forecasting efforts over the past few decades. One particular community effort was the Geospace Environment Modeling (GEM) challenge of ground magnetic field perturbations, which evaluated the predictive capacity of several empirical and first-principles models at both mid- and high-latitudes in order to choose an operational model. In this work, we use three different deep learning models (a feed-forward neural network, a long short-term memory recurrent network, and a convolutional neural network) to forecast the horizontal component of the ground magnetic field rate of change (dB_H/dt) over six different ground magnetometer stations and to compare as directly as possible with the original GEM challenge. We find that, in general, the models perform at levels similar to those obtained in the original challenge, although performance depends heavily on the particular storm being evaluated. We then discuss the limitations of such a comparison, given that the original challenge was not designed with machine learning algorithms in mind.
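A minimal sketch of one of the three model classes mentioned above (the LSTM); the architecture, feature count, and window length here are assumptions for illustration, not the paper's configuration.

```python
# Hedged sketch: an LSTM mapping a window of driver features (e.g.,
# solar wind and geomagnetic inputs) to a next-step dB_H/dt forecast
# at a single magnetometer station.
import torch
import torch.nn as nn

class DbdtLSTM(nn.Module):
    def __init__(self, n_features=10, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)      # scalar dB_H/dt forecast

    def forward(self, x):                     # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])          # regress from last hidden state

model = DbdtLSTM()
x = torch.randn(32, 120, 10)                  # e.g., 120 one-minute samples
print(model(x).shape)                         # torch.Size([32, 1])
```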
-
Accurate and efficient magnetostatic analysis is essential for multi-objective design optimization of electromagnetic devices, particularly inductors, transformers, and electric motors. A kernel-free boundary integral method (KFBIM) was studied for analyzing 2D magnetostatic problems. Although KFBIM is accurate and computationally efficient, sharp corners can be a major problem for it. In this paper, an inverse discrete Fourier transform (DFT) based geometry reconstruction is explored to overcome this challenge by smoothing sharp corners. A toroidal inductor core with an airgap (C-core) is used to show the effectiveness of the proposed approach in addressing the sharp-corner problem. A numerical example demonstrates that the method works for variable-coefficient PDEs. In addition, magnetostatic analysis for homogeneous and nonhomogeneous materials is presented for the reconstructed geometry, and results obtained with KFBIM are compared with FEM results for the original geometry to show the differences and the potential of the proposed method.
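The core of the inverse-DFT smoothing idea can be sketched as follows (our illustration; the unit-square boundary and the number of retained modes are assumptions, not the paper's C-core geometry).

```python
# Hedged sketch: low-pass Fourier reconstruction of a closed boundary,
# the basic mechanism behind inverse-DFT corner smoothing.
import numpy as np

def fourier_smooth(x, y, n_modes=8):
    """Keep only the lowest n_modes Fourier modes of a closed curve."""
    z = x + 1j * y                         # boundary as complex samples
    Z = np.fft.fft(z)
    keep = np.zeros_like(Z)
    keep[:n_modes + 1] = Z[:n_modes + 1]   # low positive frequencies
    keep[-n_modes:] = Z[-n_modes:]         # matching negative frequencies
    zs = np.fft.ifft(keep)                 # smooth, corner-free curve
    return zs.real, zs.imag

# Unit square with sharp corners, parametrized by arc length s in [0, 4)
s = np.linspace(0, 4, 400, endpoint=False)
x = np.where(s < 1, s, np.where(s < 2, 1.0, np.where(s < 3, 3 - s, 0.0)))
y = np.where(s < 1, 0.0, np.where(s < 2, s - 1, np.where(s < 3, 1.0, 4 - s)))
xs, ys = fourier_smooth(x, y, n_modes=8)   # corners rounded, curve stays closed
```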
-
Over the past two decades, the study of self-similarity and fractality in discrete structures, particularly complex networks, has gained momentum. This surge of interest is fueled by theoretical developments within the theory of complex networks and by the practical demands of real-world applications. Nonetheless, translating the principles of fractal geometry from the domain of general topology, which deals with continuous or infinite objects, to finite structures in a mathematically rigorous way poses a formidable challenge. In this paper, we survey a theory that makes it possible to identify and analyze fractal networks through the innate methodologies of graph theory and combinatorics. It establishes direct graph-theoretical analogs of the topological (Lebesgue) and fractal (Hausdorff) dimensions in a way that naturally links them to combinatorial parameters that have been studied within graph theory for decades. This makes it possible to demonstrate that self-similarity in networks is defined by the patterns of intersection among densely connected network communities. Moreover, the theory bridges discrete and continuous definitions by demonstrating how the combinatorial characterization of the Lebesgue dimension via graph representation by its subsets (subgraphs/communities) extends to general topological spaces. Using this framework, we rigorously define fractal networks and connect their properties with established combinatorial concepts, such as graph colorings and descriptive complexity. The theoretical framework surveyed here sets a foundation for applications to real-life networks and for future studies of the fractal characteristics of complex networks using combinatorial methods and algorithms.
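As a loosely related, hands-on illustration of probing network self-similarity, here is the classic greedy box-covering construction via graph coloring (our sketch; it is related to, but not the same as, the dimension theory surveyed in this item).

```python
# Hedged sketch: count the boxes of diameter < l_B needed to cover a
# network; proper colorings of an auxiliary "too far apart" graph give
# valid box assignments (Song-Havlin-Makse style greedy bound).
import networkx as nx

def box_count(G, l_b):
    """Greedy upper bound on the number of boxes of diameter < l_b."""
    dist = dict(nx.all_pairs_shortest_path_length(G))
    aux = nx.Graph()
    aux.add_nodes_from(G)
    for u in G:
        for v in G:
            # connect nodes too far apart to share a box
            if u < v and dist[u].get(v, float("inf")) >= l_b:
                aux.add_edge(u, v)
    coloring = nx.greedy_color(aux)        # one color class = one box
    return len(set(coloring.values()))

G = nx.barabasi_albert_graph(200, 2, seed=1)
for l_b in (2, 3, 4):
    print(l_b, box_count(G, l_b))          # N_B(l_B) values for a log-log fit
```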