Chang, Xiangyu, Li, Yingcong, Oymak, Samet, and Thrampoulidis, Christos. "Provable Benefits of Overparameterization in Model Compression: From Double Descent to Pruning Neural Networks." Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence, 2021. Retrieved from https://par.nsf.gov/biblio/10312032.
@inproceedings{osti_10312032,
  title = {Provable Benefits of Overparameterization in Model Compression: From Double Descent to Pruning Neural Networks},
  author = {Chang, Xiangyu and Li, Yingcong and Oymak, Samet and Thrampoulidis, Christos},
  booktitle = {Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence},
  year = {2021},
  url = {https://par.nsf.gov/biblio/10312032},
}