Wang, H., Gurbuzbalaban, Mert, Zhu, L., Simsekli, U., and Erdogdu, M. A. "Convergence Rates of Stochastic Gradient Descent under Infinite Noise Variance." Advances in Neural Information Processing Systems 34. Retrieved from https://par.nsf.gov/biblio/10326421.
@article{osti_10326421,
  title = {Convergence Rates of Stochastic Gradient Descent under Infinite Noise Variance},
  author = {Wang, H. and Gurbuzbalaban, Mert and Zhu, L. and Simsekli, U. and Erdogdu, M. A.},
  journal = {Advances in Neural Information Processing Systems},
  volume = {34},
  url = {https://par.nsf.gov/biblio/10326421},
}