"Knowledge Distillation as Efficient Pre-training: Faster Convergence, ..."

Ruifei He et al. (2022)


DOI: 10.1109/CVPR52688.2022.00895

access: closed

type: Conference or Workshop Paper

metadata version: 2024-03-20
