Xie, Shuo, Qiu, Jiahao, Pasad, Ankita, Du, Li, Qu, Qing, and Mei, Hongyuan. Hidden state variability of pretrained language models can guide computation reduction for transfer learning. The 2022 Conference on Empirical Methods in Natural Language Processing. Retrieved from https://par.nsf.gov/biblio/10394121.
@inproceedings{osti_10394121,
  title     = {Hidden state variability of pretrained language models can guide computation reduction for transfer learning},
  author    = {Xie, Shuo and Qiu, Jiahao and Pasad, Ankita and Du, Li and Qu, Qing and Mei, Hongyuan},
  booktitle = {The 2022 Conference on Empirical Methods in Natural Language Processing},
  year      = {2022},
  url       = {https://par.nsf.gov/biblio/10394121},
}