Portuguese RoBERTa language model
Handle: https://hdl.handle.net/21.11129/0000-000E-631E-2 (persistent URL to this page)
Pre-trained RoBERTa model for Portuguese in the HuggingFace (PyTorch) format, with 6 layers and 12 attention heads, totaling 68M parameters. Pre-training was done on 10 million Portuguese sentences and 10 million English sentences from the OSCAR corpus. A usage sketch is given below.
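As a minimal sketch of how such a model can be loaded and queried with the HuggingFace transformers library: the identifier "portuguese-roberta" below is a hypothetical placeholder for the local directory (or hub name) under which the downloaded model is stored, not a name confirmed by this page.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# "portuguese-roberta" is a placeholder path; substitute the directory
# or hub identifier where this resource was saved after download.
tokenizer = AutoTokenizer.from_pretrained("portuguese-roberta")
model = AutoModelForMaskedLM.from_pretrained("portuguese-roberta")

# Fill in the masked token of a Portuguese sentence.
text = f"A capital de Portugal é {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring vocabulary item at the mask position.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```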
Santos, Rodrigo, João Rodrigues, António Branco, and Rui Vaz (to appear). "Neural Text Categorization with Transformers for learning Portuguese as a Second Language". In Proceedings of the EPIA Conference on Artificial Intelligence (EPIA 2021).