"MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing ..."
Wenhui Wang et al. (2020)
- Wenhui Wang, Hangbo Bao, Shaohan Huang, Li Dong, Furu Wei:
MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing Pretrained Transformers. CoRR abs/2012.15828 (2020)