Transformer-Based Language Model Surprisal Predicts Human Reading Times Best with About Two Billion Training Tokens
- Award ID(s): 1816891
- PAR ID: 10557739
- Publisher / Repository: Association for Computational Linguistics
- Date Published:
- Page Range / eLocation ID: 1915 to 1921
- Format(s): Medium: X
- Location: Singapore
- Sponsoring Org: National Science Foundation