Text2Text Generation
Transformers
PyTorch
t5
codet5
text-generation-inference
doyensahoo and yangwenz committed
Commit b1ee957 (verified) · 1 parent: a642dc9

Update README.md (#5)


- Update README.md (7b0193db19ed5be6de242c09be4cd05be05b5092)


Co-authored-by: Wenzhuo Yang <[email protected]>

Files changed (1)
  1. README.md +4 -0
README.md CHANGED
@@ -65,6 +65,10 @@ This model uses a code-specific BPE (Byte-Pair Encoding) tokenizer. One can prep
 
 For evaluation results on several downstream benchmarks, we refer to the paper.
 
+## Ethical Considerations
+
+This release is for research purposes only in support of an academic paper. Our models, datasets, and code are not specifically designed or evaluated for all downstream purposes. We strongly recommend users evaluate and address potential concerns related to accuracy, safety, and fairness before deploying this model. We encourage users to consider the common limitations of AI, comply with applicable laws, and leverage best practices when selecting use cases, particularly for high-risk scenarios where errors or misuse could significantly impact people’s lives, rights, or safety. For further guidance on use cases, refer to our AUP and AI AUP.
+
 ### BibTeX entry and citation info
 
 ```bibtex
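
The hunk header above references the model's code-specific BPE tokenizer. As a minimal sketch only, since this commit page does not name the exact checkpoint, the id `Salesforce/codet5-base` is assumed here purely for illustration of how such a CodeT5-style checkpoint is typically loaded with `transformers`:

```python
# Minimal sketch (not part of this commit): loading a CodeT5-style checkpoint
# together with its code-specific BPE tokenizer.
# The checkpoint id "Salesforce/codet5-base" is an illustrative assumption;
# the exact repository id is not shown on this commit page.
from transformers import RobertaTokenizer, T5ForConditionalGeneration

tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-base")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")

# Text-to-text usage: feed source code as input text and generate an output sequence.
source = "def greet(name): return 'Hello, ' + name"
inputs = tokenizer(source, return_tensors="pt")
outputs = model.generate(**inputs, max_length=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```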