Portuguese RoBERTa language model

RoBERTa_PT

A pre-trained RoBERTa model for Portuguese, distributed as a Hugging Face (PyTorch) checkpoint, with 6 layers and 12 attention heads, totaling 68M parameters. Pre-training was done on 10 million Portuguese sentences and 10 million English sentences from the OSCAR corpus.
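Assuming a standard Hugging Face checkpoint layout, the model can be loaded and probed as in the minimal sketch below. The model identifier is a placeholder: replace it with the actual Hub name or the path to the downloaded files.

    # Minimal usage sketch with the Hugging Face transformers library.
    from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

    # Placeholder identifier: substitute the model's actual Hub name or
    # the local directory the download unpacks into.
    MODEL_NAME_OR_PATH = "RoBERTa_PT"

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME_OR_PATH)
    model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME_OR_PATH)

    # RoBERTa is pre-trained with masked language modelling, so a
    # fill-mask pipeline is the most direct way to probe the weights.
    fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

    # "A capital de Portugal é <mask>." = "The capital of Portugal is <mask>."
    sentence = f"A capital de Portugal é {tokenizer.mask_token}."
    for prediction in fill_mask(sentence):
        print(prediction["token_str"], prediction["score"])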

Please cite:

Santos, Rodrigo, João Rodrigues, António Branco, and Rui Vaz. To appear. "Neural Text Categorization with Transformers for Learning Portuguese as a Second Language." In Proceedings of the EPIA Conference on Artificial Intelligence (EPIA 2021).

Download
