Neural networks have demonstrated impressive success across many domains, raising the question of what fundamental principles underlie the effectiveness of the best AI systems, and quite possibly of human intelligence. This perspective argues that compositional sparsity, i.e., the property that a compositional function has “few” constituent functions, each depending on only a small subset of the inputs, is a key principle underlying successful learning architectures. Surprisingly, all functions that are efficiently Turing computable have a compositionally sparse representation. Furthermore, deep networks that are themselves sparse can exploit this general property to avoid the “curse of dimensionality”. This framework suggests intriguing implications for the role that machine learning may play in mathematics.
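As a minimal illustration (my own, not taken from the abstract), a compositionally sparse function of d = 8 variables can be written as a binary tree of bivariate constituents, each depending on only two inputs:

\[
f(x_1,\dots,x_8) = h_{3}\bigl(h_{21}\bigl(h_{11}(x_1,x_2),\, h_{12}(x_3,x_4)\bigr),\; h_{22}\bigl(h_{13}(x_5,x_6),\, h_{14}(x_7,x_8)\bigr)\bigr)
\]

A deep network whose connectivity mirrors this tree only has to approximate two-variable constituents, so its size scales with the constituent dimensionality (here 2) rather than with the ambient dimension d; this is the sense in which compositional sparsity lets deep networks avoid the curse of dimensionality.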
Not All Learnable Distribution Classes are Privately Learnable
- Award ID(s): 2046425
- PAR ID: 10524919
- Publisher / Repository: International Conference on Algorithmic Learning Theory
- Date Published:
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- We propose a new graph neural network (GNN) module, based on relaxations of recently proposed geometric scattering transforms, which consist of a cascade of graph wavelet filters. Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations. Incorporating the LEGS module into GNNs enables the learning of longer-range graph relations compared with many popular GNNs, which often rely on encoding graph structure via smoothness or similarity between neighbors. Further, its wavelet priors result in simplified architectures with significantly fewer learned parameters than competing GNNs. We demonstrate the predictive performance of LEGS-based networks on graph classification benchmarks, as well as the descriptive quality of their learned features in biochemical graph data exploration tasks. Our results show that LEGS-based networks match or outperform popular GNNs, as well as the original geometric scattering construction, on many datasets, particularly in biochemical domains, while retaining certain mathematical properties of handcrafted (non-learned) geometric scattering. (An illustrative sketch of the underlying wavelet construction appears after this list.)
- Goda, Keisuke; Tsia, Kevin K. (Ed.) We present a new deep compressed imaging modality that scans a learned illumination pattern over the sample and detects the signal with a single-pixel detector. This new imaging modality allows compressed sampling of the object and thus a high imaging speed. The object is reconstructed by a deep neural network inspired by compressed sensing algorithms. We optimize the illumination pattern and the image reconstruction network by training an end-to-end auto-encoder framework. Compared with conventional single-pixel cameras and point-scanning imaging systems, we accomplish high-speed imaging with a reduced light dosage while preserving high imaging quality. (A conceptual sketch of such an end-to-end pipeline appears after this list.)
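As referenced in the LEGS item above, the following is a minimal NumPy sketch of the fixed (non-learned) geometric scattering construction that LEGS relaxes: a lazy random-walk diffusion operator and dyadic wavelets Psi_j = P^(2^(j-1)) - P^(2^j) followed by a modulus nonlinearity. The function names, dyadic scales, and nonlinearity are standard choices from the scattering literature, not the authors' LEGS code.

import numpy as np

def lazy_walk(adj):
    """Lazy random-walk diffusion operator P = (I + A D^{-1}) / 2."""
    deg = adj.sum(axis=0)
    n = adj.shape[0]
    return 0.5 * (np.eye(n) + adj / np.maximum(deg, 1e-12))

def wavelet_features(adj, x, num_scales=4):
    """Apply dyadic graph wavelets Psi_j = P^{2^{j-1}} - P^{2^j} to a node
    signal x and take the modulus (first-order scattering coefficients)."""
    p_prev = lazy_walk(adj)                # P^{2^0}
    feats = []
    for j in range(1, num_scales + 1):
        p_next = p_prev @ p_prev           # P^{2^j}
        psi_j = p_prev - p_next            # band-pass wavelet at scale 2^j
        feats.append(np.abs(psi_j @ x))    # first-order scattering coefficient
        p_prev = p_next
    return np.stack(feats)                 # shape: (num_scales, num_nodes)

# Example: features for a 4-node path graph with a one-hot signal
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
print(wavelet_features(adj, np.array([1.0, 0.0, 0.0, 0.0])).shape)  # (4, 4)

LEGS, as described in the abstract, makes the choice of diffusion scales learnable rather than fixed to the dyadic schedule used here.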
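As referenced in the compressed-imaging item above, here is a minimal PyTorch sketch of an end-to-end single-pixel imaging auto-encoder, assuming a 32x32 object, 128 measurements, and a small fully connected decoder; all layer sizes and names are illustrative assumptions rather than the authors' architecture.

import torch
import torch.nn as nn

class SinglePixelAutoencoder(nn.Module):
    """Encoder = learned illumination patterns (one measurement per pattern);
    decoder = neural reconstruction of the image from the measurements."""
    def __init__(self, n_pixels=32 * 32, n_measurements=128):
        super().__init__()
        # Each row of this weight matrix plays the role of one scanned
        # illumination pattern; the single-pixel detector records its
        # inner product with the object.
        self.illumination = nn.Linear(n_pixels, n_measurements, bias=False)
        self.decoder = nn.Sequential(
            nn.Linear(n_measurements, 512), nn.ReLU(),
            nn.Linear(512, n_pixels),
        )

    def forward(self, image_flat):
        measurements = self.illumination(image_flat)
        return self.decoder(measurements)

# One joint optimization step over patterns and reconstruction network
model = SinglePixelAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.rand(16, 32 * 32)              # stand-in batch of flattened images
optimizer.zero_grad()
recon = model(images)
loss = nn.functional.mse_loss(recon, images)  # reconstruction objective
loss.backward()
optimizer.step()

Training the two parts jointly, as in the abstract's end-to-end auto-encoder framework, lets the illumination patterns adapt to the reconstruction network; the 128-measurement compression factor here is purely illustrative.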