microsoft/unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities.
Fundamental research to improve the modeling generality and capability of Transformers at any scale, as well as their training stability and efficiency.