CTR-BERT: Cost-effective knowledge distillation for billion-parameter teacher models
2021
Aashiq Muhamed
Iman Keivanloo
Sujan Perera
James Mracek
Yi Xu
Qingjun Cui
Santosh Rajagopalan
Belinda Zeng
Trishul Chilimbi
Keywords: Process engineering, Distillation, Computer science