AgaMiko committed on
Commit 6d6e3cb
1 Parent(s): 7dd2ec8

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -5,7 +5,7 @@ language:
 datasets:
 - Wikipedia
 tags:
-- sentence similarity
+- sentence-similarity
 ---
 # SHerbert - Polish SentenceBERT
 SentenceBERT is a modification of the pretrained BERT network that uses siamese and triplet network structures to derive semantically meaningful sentence embeddings that can be compared using cosine similarity. Training was based on the original paper [Siamese BERT models for the task of semantic textual similarity (STS)](https://arxiv.org/abs/1908.10084) with a slight modification of how the training data was used. The goal of the model is to generate different embeddings based on the semantic and topic similarity of the given text.
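The cosine-similarity comparison that the README describes can be sketched as follows. This is a minimal illustration with hand-made toy vectors standing in for real model embeddings (actual SentenceBERT embeddings are high-dimensional and produced by the model itself, which this sketch does not run):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" (hypothetical values for illustration only).
emb_similar_1 = [0.9, 0.1, 0.3, 0.4]   # e.g. embedding of one sentence
emb_similar_2 = [0.8, 0.2, 0.3, 0.5]   # a semantically close sentence
emb_unrelated = [-0.4, 0.9, -0.2, 0.1] # an unrelated sentence

print(cosine_similarity(emb_similar_1, emb_similar_2))  # near 1.0: similar
print(cosine_similarity(emb_similar_1, emb_unrelated))  # lower: dissimilar
```

Sentences the model considers semantically close yield embeddings whose cosine similarity approaches 1.0, while unrelated sentences score lower.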