VladimirVorobev committed
Commit: e91bb19
Parent(s): 82820e0
Update README.md
README.md CHANGED
@@ -36,7 +36,7 @@ This model is based on the T5-base model. We used "transfer learning" to get our
 
 [Kaggle](https://www.kaggle.com/datasets/vladimirvorobevv/chatgpt-paraphrases) link
 
-
+## Deploying example
 ```python
 from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
 
@@ -76,7 +76,7 @@ def paraphrase(
     return res
 ```
 
-
+## Usage examples
 
 **Input:**
 ```python
@@ -107,7 +107,7 @@ paraphrase(text)
 ```
 
 
-
+## Train parameters
 ```python
 epochs = 5
 batch_size = 64
@@ -116,9 +116,11 @@ lr = 5e-5
 batches_qty = 196465
 betas = (0.9, 0.999)
 eps = 1e-08
-
-```
 ```
+
+### BibTeX entry and citation info
+
+```bibtex
 @inproceedings{chatgpt_paraphraser,
 author={Vladimir Vorobev, Maxim Kuznetsov},
 title={A paraphrasing model based on ChatGPT paraphrases},
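Note on the "Deploying example" and "Usage examples" sections referenced above: the hunks only show fragments of the README's code (the `transformers` import, the `paraphrase(` signature, and its `return res` line). Below is a minimal sketch of how a T5-based paraphraser like this one is typically loaded and called. The model id, the `paraphrase:` prompt prefix, the generation settings, and the sample sentence are assumptions for illustration and are not taken from this commit.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed model id; the commit only edits headings, so the actual
# checkpoint name is not visible in this diff.
model_name = "humarin/chatgpt_paraphraser_on_T5_base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)


def paraphrase(text, num_beams=5, num_return_sequences=5, max_length=128):
    # T5-style paraphrasers are commonly prompted with a task prefix;
    # the exact prefix used by this model is an assumption here.
    inputs = tokenizer(
        f"paraphrase: {text}",
        return_tensors="pt",
        truncation=True,
        max_length=max_length,
    )
    outputs = model.generate(
        **inputs,
        num_beams=num_beams,
        num_return_sequences=num_return_sequences,
        max_length=max_length,
    )
    # Decode every beam candidate back to a plain string.
    res = tokenizer.batch_decode(outputs, skip_special_tokens=True)
    return res


print(paraphrase("The weather was nice, so we decided to take a walk."))
```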
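The "Train parameters" hunks list Adam-family optimizer settings (lr = 5e-5, betas = (0.9, 0.999), eps = 1e-08, which match Adam defaults apart from the learning rate). A short sketch of how these values could be wired into a PyTorch AdamW optimizer follows; the choice of AdamW, the `t5-base` checkpoint name, and the surrounding code are assumptions, since the commit does not show the actual training loop.

```python
import torch
from transformers import AutoModelForSeq2SeqLM

# Hyperparameter values as listed in the README's "Train parameters" block.
epochs = 5
batch_size = 64
lr = 5e-5
batches_qty = 196465
betas = (0.9, 0.999)
eps = 1e-08

# Assumed setup: the README says the model is based on T5-base, but the
# optimizer choice and training code are not part of this commit.
model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")
optimizer = torch.optim.AdamW(model.parameters(), lr=lr, betas=betas, eps=eps)
```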