BERTimbau - Portuguese BERT-Base language model
| Handle | https://hdl.handle.net/21.11129/0000-000E-6726-4 (persistent URL to this page) |
|---|---|
| URL | https://github.com/neuralmind-ai/portuguese-bert/ |
This resource contains a pre-trained BERT language model for Portuguese. A BERT-Base cased variant was trained on BrWaC (Brazilian Web as Corpus), a large Portuguese corpus, for 1,000,000 steps using whole-word masking.
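Whole-word masking means that when one WordPiece of a word is selected for masking, every piece of that word is masked together, rather than masking sub-word pieces independently. A minimal sketch of the idea in plain Python (the tokens, function name, and masking rate here are illustrative assumptions, not the project's actual pre-processing code):

```python
import random

def whole_word_mask(tokens, mask_rate=0.15, seed=0):
    """Mask whole words in a WordPiece token sequence (illustrative sketch)."""
    # Group WordPiece tokens into whole words: continuation pieces
    # start with "##" and belong to the preceding word.
    words = []
    for tok in tokens:
        if tok.startswith("##") and words:
            words[-1].append(tok)
        else:
            words.append([tok])

    # Pick a subset of whole words to mask (at least one).
    rng = random.Random(seed)
    n_mask = max(1, round(len(words) * mask_rate))
    chosen = set(rng.sample(range(len(words)), n_mask))

    # Replace every piece of a chosen word with [MASK].
    out = []
    for i, word in enumerate(words):
        if i in chosen:
            out.extend(["[MASK]"] * len(word))
        else:
            out.extend(word)
    return out
```

The key property is that the two pieces of a word like "apren" / "##de" are always masked together or not at all, forcing the model to predict the full word from context.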
The model is available as checkpoints for both TensorFlow and PyTorch.