This content will become publicly available on May 6, 2025
Accelerating Large Language Model Training with Hybrid GPU-based Compression
- Award ID(s): 2312927
- PAR ID: 10524862
- Publisher / Repository: IEEE/ACM
- Date Published:
- Page Range / eLocation ID: 1 to 9
- Format(s): Medium: X
- Location: Philadelphia, USA
- Sponsoring Org: National Science Foundation