BERTimbau - Portuguese BERT-Large language model

This resource contains a pre-trained BERT language model for Portuguese. A BERT-Large cased variant was trained on BrWaC (Brazilian Web as Corpus), a large Brazilian Portuguese web corpus, for 1,000,000 steps using whole-word masking.
The model is available as artifacts for both TensorFlow and PyTorch.
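
Whole-word masking, mentioned above, means that when a word is selected for masking, all of its WordPiece sub-tokens are masked together rather than independently. A minimal self-contained sketch of the idea (the tokens, function name, and mask rate are illustrative assumptions, not part of the released model's code):

```python
import random

def whole_word_mask(tokens, mask_rate=0.15, seed=0):
    """Illustrative whole-word masking: if a word is selected,
    every one of its WordPiece sub-tokens becomes [MASK].
    Sub-tokens starting with '##' continue the previous word."""
    rng = random.Random(seed)
    # Group token indices into whole words.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    # Decide masking per word, not per sub-token.
    for word in words:
        if rng.random() < mask_rate:
            for i in word:
                masked[i] = "[MASK]"
    return masked

# Hypothetical WordPiece output for a Portuguese sentence fragment.
tokens = ["o", "modelo", "entende", "portu", "##gues"]
print(whole_word_mask(tokens, mask_rate=0.5, seed=1))
```

With per-token masking, "portu" could be masked while "##gues" survives, making the prediction task partly trivial; masking whole words forces the model to rely on surrounding context.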
