Michael Beukman committed
Commit 359b690
Parent(s): 354e291
Slightly improved model card
README.md CHANGED
@@ -68,8 +68,24 @@ In general, this model performed worse on the 'date' category compared to others
 Here are some performance details on this specific model, compared to others we trained.
 All of these metrics were calculated on the test set, and the seed that gave the best overall F1 score was chosen. The first three result columns are averaged over all categories, and the latter four provide performance broken down by category.
 
+These models can predict the following labels for a token ([source](https://huggingface.co/Davlan/xlm-roberta-large-masakhaner)):
 
-| Model Name | Starting point | Evaluation / Fine-tune Language | F1 | Precision | Recall | F1 (DATE) | F1 (LOC) | F1 (ORG) | F1 (PER) |
+Abbreviation|Description
+-|-
+O|Outside of a named entity
+B-DATE |Beginning of a DATE entity right after another DATE entity
+I-DATE |DATE entity
+B-PER |Beginning of a person’s name right after another person’s name
+I-PER |Person’s name
+B-ORG |Beginning of an organisation right after another organisation
+I-ORG |Organisation
+B-LOC |Beginning of a location right after another location
+I-LOC |Location
+
+
+
+| Model Name | Starting point | Evaluation / Fine-tune Language | F1 | Precision | Recall | F1 (DATE) | F1 (LOC) | F1 (ORG) | F1 (PER) |
 | -------------------------------------------------- | -------------------- | -------------------- | -------------- | -------------- | -------------- | -------------- | -------------- | -------------- | -------------- |
 | [xlm-roberta-base-finetuned-swahili-finetuned-ner-yoruba](https://huggingface.co/mbeukman/xlm-roberta-base-finetuned-swahili-finetuned-ner-yoruba) (This model) | [swa](https://huggingface.co/Davlan/xlm-roberta-base-finetuned-swahili) | yor | 80.29 | 78.34 | 82.35 | 77.00 | 82.00 | 73.00 | 86.00 |
 | [xlm-roberta-base-finetuned-yoruba-finetuned-ner-yoruba](https://huggingface.co/mbeukman/xlm-roberta-base-finetuned-yoruba-finetuned-ner-yoruba) | [yor](https://huggingface.co/Davlan/xlm-roberta-base-finetuned-yoruba) | yor | 83.68 | 79.92 | 87.82 | 78.00 | 86.00 | 74.00 | 92.00 |
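
The label scheme added in this commit is what the model emits at inference time. Below is a minimal usage sketch with the Hugging Face `transformers` token-classification (`ner`) pipeline; the example sentence and printed fields are illustrative and not taken from the model card.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Model produced by this repository (name taken from the comparison table above).
model_name = "mbeukman/xlm-roberta-base-finetuned-swahili-finetuned-ner-yoruba"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

# The "ner" pipeline returns one label per (sub-word) token, drawn from the
# abbreviation table above: O, B-/I-DATE, B-/I-PER, B-/I-ORG, B-/I-LOC.
ner = pipeline("ner", model=model, tokenizer=tokenizer)

# Illustrative Yoruba sentence (an assumption, not from the model card).
sentence = "Obafemi Awolowo lo si Ibadan ni ojo Aiku."
for token in ner(sentence):
    print(token["word"], token["entity"], round(token["score"], 3))
```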
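
The F1 / precision / recall figures in the comparison table are entity-level scores of the kind produced by CoNLL-style NER evaluation. As a hedged sketch (the commit does not say which evaluation tool was used), the `seqeval` library computes both the overall scores and the per-category breakdown from gold and predicted label sequences:

```python
from seqeval.metrics import classification_report, f1_score, precision_score, recall_score

# Hypothetical gold and predicted label sequences for two short sentences,
# using the label scheme from the table above; a real evaluation would use
# the actual Yoruba test set, not these toy lists.
y_true = [["I-PER", "I-PER", "O", "I-LOC"], ["O", "I-DATE", "I-DATE", "O"]]
y_pred = [["I-PER", "I-PER", "O", "I-LOC"], ["O", "I-DATE", "O", "O"]]

# Overall entity-level scores (seqeval's default averaging; the card's exact
# averaging over categories may differ), as in the first three result columns.
print(f1_score(y_true, y_pred), precision_score(y_true, y_pred), recall_score(y_true, y_pred))

# Per-entity-type breakdown (DATE, LOC, PER, ...), as in the last four columns.
print(classification_report(y_true, y_pred))
```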