
Knowledge distillation

In machine learning, knowledge distillation is the process of transferring knowledge from a large model to a smaller one. While large models have higher knowledge capacity than small models, this capacity is often not fully utilized, so a smaller "student" model can frequently be trained to reproduce much of the larger "teacher" model's behaviour at a fraction of the cost.
  • Hinton, Vinyals, Dean, "Distilling the Knowledge in a Neural Network" (2015): https://arxiv.org/abs/1503.02531
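
A minimal sketch of the distillation loss described in the paper above, assuming PyTorch: the student is trained against temperature-softened teacher probabilities (a KL term) blended with the usual hard-label cross-entropy. The function name `distillation_loss` and the values `temperature=4.0` and `alpha=0.5` are illustrative choices, not prescribed by the source.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend soft-target KL (teacher -> student) with hard-label cross-entropy."""
    # Soften both distributions with the same temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between softened distributions; the T^2 factor keeps
    # its gradient magnitude comparable to the hard-label term.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Hypothetical usage with random tensors standing in for real models and data.
if __name__ == "__main__":
    batch, num_classes = 8, 10
    teacher_logits = torch.randn(batch, num_classes)  # frozen teacher output
    student_logits = torch.randn(batch, num_classes, requires_grad=True)
    labels = torch.randint(0, num_classes, (batch,))
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()  # gradients flow only into the student's logits
    print(float(loss))
```

In practice the teacher's logits are computed once (or on the fly with gradients disabled), and only the student's parameters are updated with this combined loss.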
