

Search for: All records

Award ID contains: 2046816

Note: Clicking a Digital Object Identifier (DOI) link takes you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the publisher's embargo period.

Some links on this page may take you to non-federal websites, whose policies may differ from those of this site.

  1. Abstract: In this paper, we study the problem of learning the weights of a deep convolutional neural network. We consider a network in which convolutions are carried out over non-overlapping patches. We develop an algorithm for simultaneously learning all the kernels from the training data. Our approach, dubbed deep tensor decomposition (DeepTD), is based on a low-rank tensor decomposition. We theoretically investigate DeepTD under a realizable model for the training data, where the inputs are drawn i.i.d. from a Gaussian distribution and the labels are generated according to planted convolutional kernels. We show that DeepTD is sample efficient and provably works as soon as the sample size exceeds the total number of convolutional weights in the network. (A minimal sketch of the moment-based idea behind DeepTD follows this list.)
  2. Free, publicly accessible full text available July 15, 2026
  3. Free, publicly accessible full text available June 1, 2026
  4. Free, publicly accessible full text available May 15, 2026
  5. Free, publicly accessible full text available May 1, 2026
  6. Free, publicly accessible full text available February 25, 2026
  7. Free, publicly accessible full text available February 25, 2026
  8. Free, publicly accessible full text available December 31, 2025
  9. Free, publicly accessible full text available December 31, 2025
  10. Free, publicly accessible full text available December 31, 2025
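
The abstract in item 1 describes a moment-based estimator: average label-weighted inputs over the sample, then recover the planted kernels from the resulting low-rank object. Below is a minimal sketch of that idea for the simplest case, a two-layer linear network with non-overlapping patches. The linear setting, the variable names (p1, p2, k1, k2), and the use of an SVD are illustrative assumptions for this toy case only; the paper's full DeepTD method handles deeper networks and uses a rank-one tensor decomposition rather than a matrix SVD.

```python
import numpy as np

# Toy sketch of the DeepTD-style moment estimator for a two-layer *linear*
# convolutional network with non-overlapping patches. All names and the
# simplified setting are illustrative assumptions, not the paper's method.

rng = np.random.default_rng(0)

p1, p2 = 5, 4            # kernel sizes of the two layers
d = p1 * p2              # input dimension: p2 non-overlapping patches of size p1
n = 20 * d               # sample size well above the p1 + p2 planted weights

# Planted kernels that generate the labels (the realizable model).
k1 = rng.standard_normal(p1)
k2 = rng.standard_normal(p2)

# Inputs x_i ~ N(0, I_d), reshaped into their non-overlapping patches.
X = rng.standard_normal((n, d))
patches = X.reshape(n, p2, p1)

# Layer 1: each patch inner-producted with k1; layer 2: combined with k2.
y = np.einsum("nqp,p,q->n", patches, k1, k2)

# Moment estimate: average y_i times the reshaped input. For Gaussian
# inputs, E[y * patches] = outer(k2, k1), a rank-one matrix.
T_hat = np.einsum("n,nqp->qp", y, patches) / n

# Rank-one decomposition recovers the kernels up to a shared scale and
# sign ambiguity (an SVD suffices for two layers; deeper networks need a
# rank-one tensor decomposition).
U, s, Vt = np.linalg.svd(T_hat)
k2_hat, k1_hat = U[:, 0], Vt[0]

# Compare estimates to the planted kernels via normalized correlation.
corr = lambda a, b: abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
print(f"k1 alignment: {corr(k1, k1_hat):.4f}")  # ~1.0
print(f"k2 alignment: {corr(k2, k2_hat):.4f}")  # ~1.0
```

With two layers the label is bilinear in the kernels, y = k2ᵀ X k1 for the reshaped input X, so the averaged moment is a rank-one matrix and its top singular vectors align with the kernels; with more layers the analogous moment is a rank-one tensor, which motivates the tensor decomposition in DeepTD.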