Thacio Garcia Scandaroli committed
Commit: d05f802
1 Parent(s): 3b27a9c

Update README.md

Files changed (1): README.md (+5 -14)
README.md CHANGED
@@ -41,7 +41,11 @@ A context window of 1024 tokens was used, along with a GPT2 tokenizer with
 
 ## Uses
 
-<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
+The recommended use is fine-tuning.
+
+A notebook-format tutorial for fine-tuning decoder and encoder-decoder (T5) models is available: [Fine-tune Large Language Models](endereço aqui)
+
+### Direct Use
 
 Example of text generation with top_k of 30
 
@@ -89,19 +93,6 @@ print(last_hidden_states)
 # 5.6911e-02, 1.2650e-01]]], grad_fn=<MulBackward0>)
 ```
 
-### Direct Use
-
-<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
-[More Information Needed]
-
-
-### Out-of-Scope Use
-
-<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
-[More Information Needed]-
-
 ## Bias, Risks, and Limitations
 
 <!-- This section is meant to convey both technical and sociotechnical limitations. -->
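
The README above refers to text generation with top_k of 30 without showing what top-k filtering does. As background only (not code from the README or from this commit), here is a minimal pure-Python sketch of top-k sampling: keep the `k` highest logits, renormalize with a softmax, and sample from those survivors.

```python
import math
import random

def top_k_sample(logits, k, rng=random):
    """Sample a token index from the k highest-scoring logits."""
    # Indices of the k largest logits; all others are discarded.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Softmax over the surviving logits (shift by the max for stability).
    m = max(logits[i] for i in top)
    weights = [math.exp(logits[i] - m) for i in top]
    total = sum(weights)
    probs = [w / total for w in weights]
    # Draw one surviving index proportionally to its probability.
    return rng.choices(top, weights=probs, k=1)[0]

logits = [0.1, 2.0, -1.0, 3.5, 0.0]
idx = top_k_sample(logits, k=2)
print(idx)  # always 1 or 3: only the two largest logits survive
```

With `top_k=30`, as in the README's example, a model's `generate` call restricts each sampling step to the 30 most likely tokens, which trims the long tail of unlikely continuations.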