Tnt3o5/tnt_v5_lega_new_tokens
Tags: Sentence Similarity, sentence-transformers, Safetensors, roberta, feature-extraction, Generated from Trainer, dataset_size:101442, loss:MatryoshkaLoss, loss:MultipleNegativesRankingLoss, Eval Results, Inference Endpoints
Papers: arxiv:1908.10084, arxiv:2205.13147, arxiv:1705.00652
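
A minimal usage sketch, assuming the repository loads as a standard Sentence Transformers checkpoint (the model ID is taken from this page; the example sentences are placeholders):

```python
from sentence_transformers import SentenceTransformer, util

# Load the checkpoint from the Hugging Face Hub.
model = SentenceTransformer("Tnt3o5/tnt_v5_lega_new_tokens")

sentences = [
    "Placeholder sentence one.",
    "Placeholder sentence two.",
]

# Encode into dense vectors and score pairwise cosine similarity.
embeddings = model.encode(sentences)
scores = util.cos_sim(embeddings, embeddings)
print(scores)
```

Because the model was trained with MatryoshkaLoss, it should also tolerate truncated embeddings; recent sentence-transformers releases expose this through the truncate_dim argument of SentenceTransformer, although the supported dimensions are not stated on this page.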
Files and versions (branch: main)
1 contributor · History: 2 commits
Latest commit: "Add new SentenceTransformer model" by Tnt3o5 (c5e862e, verified), about 1 month ago
File                                Size          Last commit                          Updated
1_Pooling/                          -             Add new SentenceTransformer model    about 1 month ago
.gitattributes                      1.52 kB       initial commit                       about 1 month ago
README.md                           45.7 kB       Add new SentenceTransformer model    about 1 month ago
added_tokens.json                   162 kB        Add new SentenceTransformer model    about 1 month ago
bpe.codes                           1.14 MB       Add new SentenceTransformer model    about 1 month ago
config.json                         759 Bytes     Add new SentenceTransformer model    about 1 month ago
config_sentence_transformers.json   199 Bytes     Add new SentenceTransformer model    about 1 month ago
model.safetensors                   560 MB (LFS)  Add new SentenceTransformer model    about 1 month ago
modules.json                        229 Bytes     Add new SentenceTransformer model    about 1 month ago
sentence_bert_config.json           53 Bytes      Add new SentenceTransformer model    about 1 month ago
special_tokens_map.json             965 Bytes     Add new SentenceTransformer model    about 1 month ago
tokenizer_config.json               2.34 MB       Add new SentenceTransformer model    about 1 month ago
vocab.txt                           895 kB        Add new SentenceTransformer model    about 1 month ago