"Extremely Small BERT Models from Mixed-Vocabulary Training."

Sanqiang Zhao et al. (2021)

DOI: 10.18653/v1/2021.eacl-main.238

access: open

type: Conference or Workshop Paper

metadata version: 2022-01-20