bert-base-multilingual-cased
bert-multilingual, huggingface.co/bert-base-multilingual-cased, weights, on the site since December 18, 2022 14:05
BERT multilingual base model (cased)
Pretrained model on the top 104 languages with the largest Wikipedia using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is case sensitive: it makes a difference between english and English.
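Since the model was pretrained with a masked language modeling objective, it can be used directly to fill in masked tokens. A minimal sketch using the `transformers` fill-mask pipeline (the example sentence is an assumption, not from this card):

```python
from transformers import pipeline

# Load the fill-mask pipeline with this checkpoint.
unmasker = pipeline("fill-mask", model="bert-base-multilingual-cased")

# Ask the model to predict the masked token; the input sentence is illustrative.
results = unmasker("Hello I'm a [MASK] model.")

# Each result carries the predicted token string and its score.
for r in results:
    print(r["token_str"], round(r["score"], 3))
```

Note that because the model is cased, the casing of the surrounding context can change which completions are ranked highest.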
Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by the Hugging Face team.