huawei-noah/TinyBERT_General_4L_312D

rubert-tiny, on the site since 04 May 2023, 18:24
TinyBERT: Distilling BERT for Natural Language Understanding

TinyBERT is 7.5x smaller and 9.4x faster at inference than BERT-base, while achieving competitive performance on natural language understanding tasks. It performs a novel transformer distillation at both the pre-training and task-specific learning stages. In general distillation, we use the original BERT-base without fine-tuning as the teacher and a large-scale text corpus as the learning data.
  • https://huggingface.co/huawei-noah/TinyBERT_General_4L_312D
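Below is a minimal sketch of loading this checkpoint with the Hugging Face transformers library and running a forward pass. The model identifier comes from the link above; the example sentence and the use of PyTorch tensors are illustrative assumptions.

    # Minimal sketch: load TinyBERT_General_4L_312D and extract hidden states.
    # Assumes transformers and torch are installed.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("huawei-noah/TinyBERT_General_4L_312D")
    model = AutoModel.from_pretrained("huawei-noah/TinyBERT_General_4L_312D")

    # Example input; any short English sentence works here.
    inputs = tokenizer("TinyBERT is a distilled version of BERT.", return_tensors="pt")

    with torch.no_grad():
        outputs = model(**inputs)

    # Hidden size is 312, as the "312D" in the model name indicates.
    print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 312])

For downstream tasks, the same checkpoint can be passed to a task-specific head (e.g. AutoModelForSequenceClassification) and fine-tuned as with any BERT-style encoder.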