We develop techniques to construct explicit symplectic Lefschetz fibrations over the 2-sphere with any prescribed signature σ and any spin type when σ is divisible by 16. This solves a long-standing conjecture on the existence of such fibrations with positive signature. As applications, we produce symplectic 4-manifolds that are homeomorphic but not diffeomorphic to connected sums of S^2 × S^2, with the smallest topology known to date, as well as larger examples as symplectic Lefschetz fibrations.
Explicit Analytic Solution for the Plane Elastostatic Problem with a Rigid Inclusion of Arbitrary Shape Subject to Arbitrary Far-Field Loadings
- Award ID(s): 2008105
- PAR ID: 10249657
- Date Published:
- Journal Name: Journal of Elasticity
- Volume: 144
- Issue: 1
- ISSN: 0374-3535
- Page Range / eLocation ID: 81 to 105
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Modeling distributions of covariates, or density estimation, is a core challenge in unsupervised learning. However, the majority of work only considers the joint distribution, which has limited utility in practical situations. A more general and useful problem is arbitrary conditional density estimation, which aims to model any possible conditional distribution over a set of covariates, reflecting the more realistic setting of inference based on prior knowledge. We propose a novel method, Arbitrary Conditioning with Energy (ACE), that can simultaneously estimate the distribution p(x_u | x_o) for all possible subsets of unobserved features x_u and observed features x_o. ACE is designed to avoid unnecessary bias and complexity: we specify densities with a highly expressive energy function and reduce the problem to only learning one-dimensional conditionals (from which more complex distributions can be recovered during inference). This results in an approach that is both simpler and higher-performing than prior methods. We show that ACE achieves state-of-the-art for arbitrary conditional likelihood estimation and data imputation on standard benchmarks.
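The arbitrary-conditioning interface the abstract describes — score a candidate value for one unobserved feature given an observation mask, then normalize over a 1-D grid — can be sketched in a toy form. This is not the ACE model itself; the energy network here is a hypothetical random-weight stand-in used only to illustrate the masking and one-dimensional-conditional machinery.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x_candidate, x_obs, obs_mask, params):
    # Toy unnormalized energy E(x_u ; x_o): a small fixed-weight MLP scoring
    # a candidate completion of the unobserved coordinates given the observed
    # ones. (Hypothetical stand-in for ACE's learned energy function.)
    inp = np.concatenate([x_obs * obs_mask,
                          x_candidate * (1 - obs_mask),
                          obs_mask])
    h = np.tanh(params["W1"] @ inp + params["b1"])
    return float(params["w2"] @ h + params["b2"])

d = 4
params = {
    "W1": rng.normal(size=(8, 3 * d)) * 0.5,
    "b1": np.zeros(8),
    "w2": rng.normal(size=8) * 0.5,
    "b2": 0.0,
}

x = rng.normal(size=d)
obs_mask = np.array([1.0, 1.0, 0.0, 0.0])  # features 0,1 observed; 2,3 not

# One-dimensional conditional for feature 2 on a grid, holding the other
# unobserved feature fixed — the 1-D conditionals the abstract refers to.
grid = np.linspace(-3, 3, 61)
scores = []
for v in grid:
    cand = x.copy()
    cand[2] = v
    scores.append(-energy(cand, x, obs_mask, params))
scores = np.array(scores)
probs = np.exp(scores - scores.max())
probs /= probs.sum()  # normalized distribution over the grid
```

Any subset of features can play the role of x_o simply by changing `obs_mask`, which is the sense in which the conditioning is "arbitrary".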
Bounds on the smallest eigenvalue of the neural tangent kernel (NTK) are a key ingredient in the analysis of neural network optimization and memorization. However, existing results require distributional assumptions on the data and are limited to a high-dimensional setting, where the input dimension d_0 scales at least logarithmically in the number of samples n. In this work we remove both of these requirements and instead provide bounds in terms of a measure of distance between data points: notably these bounds hold with high probability even when d_0 is held constant versus n.stra We prove our results through a novel application of the hemisphere transform.
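The quantity the abstract studies can be computed directly for a toy network: the empirical NTK Gram matrix is J Jᵀ, where J is the Jacobian of the network outputs with respect to all parameters, and its smallest eigenvalue is then read off numerically. The setup below (a two-layer ReLU network with unit-norm inputs and d_0 small relative to n) is a hypothetical illustration, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny two-layer ReLU network f(x) = a^T relu(Wx) / sqrt(m).
n, d0, m = 10, 3, 200            # samples, input dim (held small vs n), width
X = rng.normal(size=(n, d0))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # unit-norm data points

W = rng.normal(size=(m, d0))
a = rng.choice([-1.0, 1.0], size=m)

def jacobian_row(x):
    # Gradient of f at one sample with respect to (W, a), flattened.
    pre = W @ x
    act = np.maximum(pre, 0.0)        # ReLU activations
    ind = (pre > 0).astype(float)     # ReLU derivative
    gW = np.outer(a * ind, x) / np.sqrt(m)   # df/dW
    ga = act / np.sqrt(m)                    # df/da
    return np.concatenate([gW.ravel(), ga])

J = np.stack([jacobian_row(x) for x in X])   # (n, num_params)
K = J @ J.T                                  # empirical NTK Gram matrix
lam_min = np.linalg.eigvalsh(K)[0]           # smallest eigenvalue
```

For distinct inputs and sufficient width, `lam_min` is typically strictly positive; lower bounds on it (in terms of pairwise distances between the data points, per the abstract) are what control optimization and memorization guarantees.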