jarodrigues committed: Update README.md

README.md CHANGED
@@ -85,7 +85,7 @@ Please use the above canonical reference when using or citing this model.
 
 **This model card is for Gervásio 7B PT-BR**, with 7 billion parameters, a hidden size of 4096 units, an intermediate size of 11,008 units, 32 attention heads, 32 hidden layers, and a tokenizer obtained using the Byte-Pair Encoding (BPE) algorithm implemented with SentencePiece, featuring a vocabulary size of 32,000.
 
-Gervásio-7B-PTBR-Decoder is distributed under an [MIT license](https://huggingface.co/PORTULAN/gervasio-7b-portuguese-
+Gervásio-7B-PTBR-Decoder is distributed under an [MIT license](https://huggingface.co/PORTULAN/gervasio-7b-portuguese-ptpt-decoder/blob/main/LICENSE).
 
 
 <br>
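The architecture figures in the model card above can be sanity-checked with a back-of-the-envelope parameter count. The per-layer formula below assumes a LLaMA-style decoder (four square attention projections plus a three-matrix gated MLP, with untied input and output embeddings); that architectural assumption is not stated in this diff itself.

```python
# Rough parameter count for a LLaMA-style decoder, using the figures
# quoted in the model card. The layer structure (4*h^2 attention,
# 3*h*i gated MLP, untied embeddings) is an assumption, not something
# the diff states.
hidden = 4096        # hidden size
intermediate = 11008 # intermediate (MLP) size
layers = 32          # hidden layers
vocab = 32000        # BPE vocabulary size

attention = 4 * hidden * hidden       # q, k, v, o projections
mlp = 3 * hidden * intermediate      # gate, up, down projections
embeddings = 2 * vocab * hidden      # input + output embedding matrices

total = layers * (attention + mlp) + embeddings
print(f"{total:,} parameters (~{total / 1e9:.2f}B)")
# → 6,738,149,376 parameters (~6.74B)
```

The result lands close to the advertised "7 billion parameters", which is consistent with these dimensions describing the full decoder stack.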