Fu, Cheng, Shilin Zhu, Hao Su, Ching-En Lee, and Jishen Zhao. "Towards Fast and Energy-Efficient Binarized Neural Network Inference on FPGA." FPGA '19, 2019, doi:10.1145/3289602.3293990. Retrieved from https://par.nsf.gov/biblio/10109226.
@inproceedings{osti_10109226,
  title     = {Towards Fast and Energy-Efficient Binarized Neural Network Inference on FPGA},
  author    = {Fu, Cheng and Zhu, Shilin and Su, Hao and Lee, Ching-En and Zhao, Jishen},
  booktitle = {FPGA '19: Proceedings of the ACM/SIGDA International Symposium on Field-Programmable Gate Arrays},
  year      = {2019},
  doi       = {10.1145/3289602.3293990},
  url       = {https://par.nsf.gov/biblio/10109226},
}