This pretraining data will not be opened to the public due to Twitter policy.
| `code-mixed-ijebert` | BERT | 2.24 GB of text | 249 MB of text |

## Evaluation Results

We train the model for 3 epochs with a total of 296K steps, which took 12 days.
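Since the results table reports both an eval loss and an eval perplexity, it may help to note how the two relate: for a language model evaluated with cross-entropy loss (in nats per token), perplexity is simply the exponential of that loss. A minimal sketch (the function name and the example loss value are illustrative, not taken from this repository):

```python
import math

def perplexity(eval_loss: float) -> float:
    """Convert a mean cross-entropy loss (nats/token) to perplexity."""
    return math.exp(eval_loss)

# Illustrative only: an eval loss of 2.0 nats/token gives
# a perplexity of e^2 ≈ 7.389.
print(perplexity(2.0))
```

This is why lower eval loss and lower eval perplexity always move together in such tables.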
The following are the results obtained from the training:

| train loss | eval loss | eval perplexity |