Portuguese RoBERTa language model

A HuggingFace (PyTorch) pre-trained RoBERTa model for Portuguese, with 6 layers and 12 attention heads, totalling 68M parameters. Pre-training was done on 10 million Portuguese sentences and 10 million English sentences from the OSCAR corpus. Please cite: Santos, Rodrigo, João Rodrigues, Antóni...

Resource Type: Language Description
Media Type: Text
Languages: English, Portuguese
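The reported 68M parameter count can be sanity-checked from the stated architecture. The sketch below counts the weights of a 6-layer, 12-head RoBERTa-style encoder; the hidden size (768 = 12 heads × 64 dims) follows standard RoBERTa conventions, and the ~32k vocabulary size is an assumption, since the entry does not state it.

```python
# Sanity check of the reported 68M parameter count for a 6-layer,
# 12-head RoBERTa-style encoder. Hidden size and vocabulary size
# are assumptions (standard RoBERTa head dim of 64; ~32k BPE vocab).

def count_roberta_params(num_layers=6, num_heads=12, head_dim=64,
                         ffn_mult=4, vocab_size=32_000, max_positions=514):
    hidden = num_heads * head_dim                      # 768
    ffn = ffn_mult * hidden                            # 3072
    # Embeddings: token + position + token-type + LayerNorm (weight, bias)
    emb = (vocab_size * hidden + max_positions * hidden
           + hidden + 2 * hidden)
    # Per encoder layer: Q/K/V/O projections (weights + biases),
    # the feed-forward block, and two LayerNorms
    attn = 4 * (hidden * hidden + hidden)
    ffn_block = (hidden * ffn + ffn) + (ffn * hidden + hidden)
    layer = attn + ffn_block + 2 * (2 * hidden)
    return emb + num_layers * layer

total = count_roberta_params()
print(f"{total / 1e6:.1f}M parameters")  # lands close to the reported 68M
```

With these assumptions the count comes out just under 68M, consistent with the entry's figure; a larger vocabulary would push it correspondingly higher, since the token embedding table dominates at this depth.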
Gervásio PT-BR base

Gervásio PT-* is a foundational large language model for the Portuguese language. It is a decoder of the GPT family, based on the Transformer neural architecture and developed from the Pythia model, with competitive performance for this language. It has different versions that were trained for ...

Resource Type: Language Description
Media Type: Text
Language: Portuguese
Gervásio PT-PT base

Gervásio PT-* is a foundational large language model for the Portuguese language. It is a decoder of the GPT family, based on the Transformer neural architecture and developed from the Pythia model, with competitive performance for this language. It has different versions that were trained for ...

Resource Type: Language Description
Media Type: Text
Language: Portuguese
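Being a decoder of the GPT family means Gervásio generates text autoregressively: each step feeds the tokens produced so far back into the model and appends the highest-scoring next token. A minimal sketch of that greedy loop, with a toy stub standing in for the real model (the stub's vocabulary and transition rule are purely illustrative):

```python
# Toy illustration of greedy autoregressive decoding, the generation
# scheme used by decoder-only (GPT-family) models such as Gervásio.
# toy_next_token_logits is a stand-in for the real decoder.

def toy_next_token_logits(context):
    """Score every vocabulary item given the context (illustrative rule)."""
    order = {"a": "b", "b": "c", "c": "<eos>"}
    target = order.get(context[-1], "<eos>")
    return {tok: (1.0 if tok == target else 0.0)
            for tok in ["<eos>", "a", "b", "c"]}

def greedy_generate(prompt, max_new_tokens=10):
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        logits = toy_next_token_logits(tokens)   # one forward pass per step
        next_tok = max(logits, key=logits.get)   # greedy: take the argmax
        if next_tok == "<eos>":                  # stop at end-of-sequence
            break
        tokens.append(next_tok)
    return tokens

print(greedy_generate(["a"]))  # → ['a', 'b', 'c']
```

In practice one would load the published checkpoint with a library such as HuggingFace `transformers` and sample rather than always taking the argmax; the loop structure, however, is the same.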
