Update README.md
README.md CHANGED
@@ -35,7 +35,7 @@ We use [phobert-base-v2](https://github.com/VinAIResearch/PhoBERT) as the pre-tr
 
 Here are the results on the remaining 20% of the training set from the Legal Text Retrieval Zalo 2021 challenge:
 
-| Pretrained Model |
+| Pretrained Model | Training Datasets | Acc@1 | Acc@10 | Acc@100 | Pre@10 | MRR@10 |
 |-------------------------------|---------------------------------------|:------------:|:-------------:|:--------------:|:-------------:|:-------------:|
 | [Vietnamese-SBERT](https://huggingface.co/keepitreal/vietnamese-sbert) | - | 32.34 | 52.97 | 89.84 | 7.05 | 45.30 |
 | PhoBERT-base-v2 | MSMACRO | 47.81 | 77.19 | 92.34 | 7.72 | 58.37 |
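
For readers unfamiliar with the column names in the new table header, the sketch below shows the standard definitions of Acc@k, Pre@k, and MRR@k used in retrieval evaluation. It is not taken from this repository's evaluation code; the `ranked_ids` and `relevant_ids` names are hypothetical placeholders.

```python
# Minimal sketch (assumption: not the repository's own evaluation script).
# `ranked_ids` is a list of retrieved document IDs sorted by score for one query;
# `relevant_ids` is the set of gold (relevant) document IDs for that query.

def acc_at_k(ranked_ids, relevant_ids, k):
    """Acc@k: 1 if at least one relevant document appears in the top k, else 0."""
    return float(any(doc in relevant_ids for doc in ranked_ids[:k]))

def precision_at_k(ranked_ids, relevant_ids, k):
    """Pre@k: fraction of the top k retrieved documents that are relevant."""
    return sum(doc in relevant_ids for doc in ranked_ids[:k]) / k

def mrr_at_k(ranked_ids, relevant_ids, k):
    """MRR@k: reciprocal rank of the first relevant document within the top k."""
    for rank, doc in enumerate(ranked_ids[:k], start=1):
        if doc in relevant_ids:
            return 1.0 / rank
    return 0.0

# Per-query scores are averaged over all evaluation queries to produce
# table values such as Acc@10 or MRR@10 (reported here as percentages).
```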