Update README.md
README.md
CHANGED
@@ -19,11 +19,20 @@ This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentence

Special thanks to [deepset](https://huggingface.co/deepset/) for providing the model gBERT-large and also to [Philip May](https://huggingface.co/philipMay) for the translation of the dataset and chats about the topic.

-Model score after fine-tuning

-**STS-B Test: 0.8626 (Spearman)**

-This is the best result achieved that I know of.
+After fine-tuning, the model scores best compared to these models:
+
+| Model Name | Spearman<br/>German |
+|---------------------------------------------------------------|-------------------|
+| xlm-r-distilroberta-base-paraphrase-v1 | 0.8079 |
+| [xlm-r-100langs-bert-base-nli-stsb-mean-tokens](https://huggingface.co/sentence-transformers/xlm-r-100langs-bert-base-nli-stsb-mean-tokens) | 0.7877 |
+| xlm-r-bert-base-nli-stsb-mean-tokens | 0.7877 |
+| [roberta-large-nli-stsb-mean-tokens](https://huggingface.co/sentence-transformers/roberta-large-nli-stsb-mean-tokens) | 0.6371 |
+| [T-Systems-onsite/<br/>german-roberta-sentence-transformer-v2](https://huggingface.co/T-Systems-onsite/german-roberta-sentence-transformer-v2) | 0.8529 |
+| [paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2) | 0.8355 |
+| [T-Systems-onsite/<br/>cross-en-de-roberta-sentence-transformer](https://huggingface.co/T-Systems-onsite/cross-en-de-roberta-sentence-transformer) | 0.8550 |
+| aari1995/German_Semantic_STS_V2 | **0.8626** |

<!--- Describe your model here -->
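The scores in the new table are Spearman correlations on the German STS benchmark test split. As a rough illustration of how such a number can be reproduced with the sentence-transformers evaluation utilities (not necessarily the author's exact script), here is a minimal sketch; the dataset id `stsb_multi_mt` (config `de`), its column names, and the 0-5 gold-score scale are assumptions about the benchmark data:

```python
# Minimal sketch: Spearman score of a sentence-transformers model on the German
# STS benchmark test split. Assumption: the data comes from the Hub dataset
# "stsb_multi_mt" (config "de") with columns sentence1 / sentence2 /
# similarity_score on a 0-5 scale; the author's exact evaluation setup may differ.
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, InputExample
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("aari1995/German_Semantic_STS_V2")

test = load_dataset("stsb_multi_mt", name="de", split="test")
examples = [
    InputExample(
        texts=[row["sentence1"], row["sentence2"]],
        label=row["similarity_score"] / 5.0,  # rescale gold scores to 0-1
    )
    for row in test
]

evaluator = EmbeddingSimilarityEvaluator.from_input_examples(examples, name="sts-de-test")

# The evaluator embeds both sentence lists and correlates cosine similarities
# with the gold scores; depending on the sentence-transformers version it
# returns either the main Spearman score (a float) or a dict of metrics.
result = evaluator(model)
print(result)
```

Reproducing the other rows of the table would only require swapping the model id passed to `SentenceTransformer`.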