Poor Man’s BERT — Why Pruning is Better than Knowledge Distillation ✂️

Medium, 26 Jul 2020 · 6 min read

The ever-increasing size of NLP models, and the reduced usability that ensues, is something I’ve discussed…