We introduce a tunable GAN, called α-GAN, parameterized by α ∈ (0, ∞], which interpolates between various f-GANs and Integral Probability Metric based GANs (with a constrained discriminator set). We construct α-GAN using a supervised loss function, namely α-loss, a tunable loss function that captures several canonical losses. We show that α-GAN is intimately related to the Arimoto divergence, which was first proposed by Österreicher (1996) and later studied by Liese and Vajda (2006). We posit that the holistic understanding that α-GAN introduces will have the practical benefit of addressing both vanishing gradients and mode collapse.
We analyze the optimization landscape of a recently introduced tunable class of loss functions called α-loss, α ∈ (0, ∞], in the logistic model. This family encapsulates the exponential loss (α = 1/2), the log-loss (α = 1), and the 0-1 loss (α = ∞), and has compelling properties that enable the practitioner to discern among a host of operating conditions relevant to emerging learning methods. Specifically, we study the evolution of the optimization landscape of α-loss with respect to α using tools drawn from the study of strictly-locally-quasi-convex functions in addition to geometric techniques. We interpret these results in terms of optimization complexity via normalized gradient descent.
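As a minimal numeric sketch of the interpolation described above — assuming the standard closed form of α-loss from the α-loss literature, ℓ_α(p) = (α/(α−1))(1 − p^((α−1)/α)) for α ≠ 1, with the log-loss −log p at α = 1 and 1 − p in the limit α → ∞ — the following illustrates how a single tunable parameter recovers the canonical losses (the function name `alpha_loss` is illustrative, not from the paper):

```python
import numpy as np

def alpha_loss(p, alpha):
    """alpha-loss of the probability p assigned to the true label.

    alpha != 1:   (alpha / (alpha - 1)) * (1 - p ** ((alpha - 1) / alpha))
    alpha == 1:   -log(p)          (log-loss, the limiting case)
    alpha == inf: 1 - p            (soft 0-1 loss, the limiting case)
    """
    p = np.asarray(p, dtype=float)
    if alpha == 1.0:
        return -np.log(p)
    if np.isinf(alpha):
        return 1.0 - p
    return (alpha / (alpha - 1.0)) * (1.0 - p ** ((alpha - 1.0) / alpha))

# In the logistic model, p = sigmoid(y * z); at alpha = 1/2 the expression
# algebraically reduces to the exponential loss exp(-y * z):
z, y = 0.7, 1.0
p = 1.0 / (1.0 + np.exp(-y * z))
print(np.isclose(alpha_loss(p, 0.5), np.exp(-y * z)))  # True
```

Larger α flattens the penalty on low-confidence predictions (approaching 0-1 loss), while smaller α penalizes them increasingly sharply (approaching exponential loss), which is the trade-off the landscape analysis above studies as a function of α.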