We tested the performance of **XLMR-MaCoCu-tr** on benchmarks of XPOS, UPOS and NER from the [Universal Dependencies](https://universaldependencies.org/) project. We also tested on a human-translated version of the COPA data set (for details see our [Github repo](https://github.com/RikVN/COPA)). We compare performance to the strong multilingual models XLMR-base and XLMR-large, but also to the monolingual [BERTurk](https://huggingface.co/dbmdz/bert-base-turkish-cased) model. For details regarding the fine-tuning procedure, check out our [Github](https://github.com/macocu/LanguageModels).

Scores are averages of three runs, except for COPA, for which we use 10 runs. We used the same hyperparameter settings for all models for POS and NER; for COPA, we optimized the learning rate for each model on the dev set.
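Averaging scores over repeated runs can be done directly with the standard library. A minimal sketch, assuming purely hypothetical per-run accuracies (not the actual results):

```python
from statistics import mean, stdev

# Hypothetical per-run COPA accuracies for one model (NOT real results);
# COPA is averaged over 10 runs, POS/NER over 3 runs.
copa_runs = [0.61, 0.63, 0.60, 0.62, 0.64, 0.61, 0.63, 0.62, 0.60, 0.64]

avg = mean(copa_runs)       # reported score: mean across runs
sd = stdev(copa_runs)       # sample standard deviation across runs
print(f"COPA: {avg:.3f} (±{sd:.3f})")
```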
|                    | **UPOS** | **UPOS** | **XPOS** | **XPOS** | **NER** | **NER** | **COPA** |
|--------------------|:--------:|:--------:|:--------:|:--------:|:-------:|:-------:|:--------:|