---
base_model: BAAI/bge-small-en-v1.5
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@5
- cosine_ndcg@10
- cosine_ndcg@100
- cosine_mrr@5
- cosine_mrr@10
- cosine_mrr@100
- cosine_map@100
- dot_accuracy@1
- dot_accuracy@5
- dot_accuracy@10
- dot_precision@1
- dot_precision@5
- dot_precision@10
- dot_recall@1
- dot_recall@5
- dot_recall@10
- dot_ndcg@5
- dot_ndcg@10
- dot_ndcg@100
- dot_mrr@5
- dot_mrr@10
- dot_mrr@100
- dot_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:563
- loss:GISTEmbedLoss
widget:
- source_sentence: Can I pay for parking using digital payment methods like UPI, credit/debit
    cards, or mobile wallets?
  sentences:
  - The vibrant colors of autumn leaves create a breathtaking tapestry across the
    landscape, reminding us of nature's artistry. Many people enjoy taking strolls
    through parks to appreciate the crisp air and the sound of crunching leaves underfoot.
    Some choose to photograph the scenery, capturing fleeting moments of beauty, while
    others might indulge in seasonal treats like pumpkin spice lattes. Embracing the
    change in seasons also encourages us to reflect on personal growth and the passage
    of time as we move towards the winter months.
  - Yes, most parking areas accept digital payment methods such as UPI, credit/debit
    cards, or mobile wallets to facilitate cashless transactions. However, it is recommended
    to carry some cash as a backup because digital payments might not always work
    due to network issues and high crowd density during peak times.
  - Mahakumbh 2025 will start on 13 January with the Paush Purnima bath and end on
    26 February with the Mahashivratri bath.
- source_sentence: What is Aarti
  sentences:
  - No, shuttle buses will not have dedicated volunteers specifically, but for assistance,
    you can reach out to the nearest information center.
  - "In India, since ancient times, rivers are worshipped due to their importance\
    \ to the human life. \n\nLikewise, in Tirathraj Prayagraj, Aartis’ are performed\
    \ on the banks of Ganga, Yamuna and at Sangam with great admiration, deep-rooted\
    \ honor and devotion. In Prayagraj, Prayagraj Mela Authority and various other\
    \ communities make grand arrangements for these Aartis.\n\nThe Aartis are performed\
    \ in the mornings and evenings, in which priests (Batuks), normally 5 to 7 in\
    \ number, chant hymns with great fervor, holding meticulously designed lamps and\
    \ worship the rivers with utmost devotion. \n\nThe lamps held by the batuks represent\
    \ the importance of panchtatva. On one hand, flames of the lamps signify bowing\
    \ to the waters of the sacred rivers and on the other, the holy fumes emanating\
    \ from the lamps appear to play the mystic of heaven on earth."
  - 'In the realm of celestial bodies, the moons of Jupiter captivate astronomers
    with their striking variations. These natural satellites exhibit a diverse range
    of landscapes, from the icy crust of Europa to the volcanic surface of Io, each
    revealing secrets about the formation of our solar system.


    In laboratories around the world, researchers utilize advanced telescopes, funded
    by international space agencies, to monitor these moons, collecting data that
    aids in understanding their geological processes. They examine topographical maps
    and analyze spectrographs, revealing rich insights into the chemical compositions
    present on these distant worlds.


    Collaborations between scientists and institutions have led to remarkable discoveries,
    including the potential for subsurface oceans beneath the icy shell of Europa,
    stirring excitement about the possibility of extraterrestrial life. Meanwhile,
    rumors of missions planned to explore these enigmatic moons intensify interest
    in the ongoing quest for knowledge beyond our home planet.'
- source_sentence: Which all companies offer tour services?
  sentences:
  - There are no specific facilities exclusively for senior citizens at the Railway
    Junction in relation to the Mela. However, most railway stations generally offer
    basic amenities like wheelchairs, assistance for boarding and de-boarding, and
    special seating areas for senior citizens or those with mobility issues. It is
    advisable for senior citizens to check with the railway authorities for any additional
    support that might be available during the Mela.
  - The art of origami has captivated many enthusiasts around the world. Crafting
    intricate designs from simple sheets of paper showcases creativity and precision.
    Essential tools include sharp scissors, bone folders, and high-quality paper to
    achieve the best results. Workshops often focus on advanced techniques, leading
    to beautiful decorative pieces and useful items, enhancing the enjoyment of this
    timeless craft.
  - All information provided here includes tour services provided by UPSTDC (Uttar
    Pradesh State Tourism Development Corporation). Additionally, popular platforms
    like MakeMyTrip and other travel websites offer their own tour packages for Kumbh
    Mela and nearby attractions. For a wider range of options, you can check these
    services directly on their websites to find a tour that best suits your needs.
- source_sentence: From when to when is the Mela?
  sentences:
  - "Mahakumbh Mela 2025 will begin on 13 January with the Paush Purnima bath and\
    \ will conclude on 26 February with the Mahashivratri bath.\n \n While every day\
    \ during the Mahakumbh is considered auspicious for bathing, the main bathing\
    \ festivals are as follows:\n \n 1. Paush Purnima – 13 January\n 2. Makar Sankranti\
    \ – 14 January\n 3. Mauni Amavasya – 29 January\n 4. Vasant Panchami – 3 February\n\
    \ 5. Maghi Purnima – 12 February\n 6. Mahashivratri – 26 February\n \n Out of\
    \ these, three dates are Shahi Snan festivals, when the Akharas and saints proceed\
    \ with grand processions for the bath:\n \n 1. Makar Sankranti – 14 January\n\
    \ 2. Mauni Amavasya – 29 January\n 3. Vasant Panchami – 3 February"
  - 'The sky today is filled with vibrant clouds, where shades of orange and pink
    blend seamlessly into vast expanses of blue. The wind carries the sounds of distant
    laughter, as children chase each other through sprawling fields of lush green
    grass. Nearby, an old oak tree stands tall, its branches swaying gently and offering
    shade to those seeking respite from the warmth of the sun.


    A stream meanders through the landscape, its clear waters reflecting the brilliant
    hues of the sky above. Dragonflies dart about, their iridescent wings catching
    the light as they flit from flower to flower. In the distance, a family prepares
    a picnic, the aroma of freshly baked bread mingling with the sweet scent of blooming
    wildflowers.


    As the afternoon stretches on, the sun begins its slow descent, painting the horizon
    in richer tones. The air is filled with a sense of peace and joy, moments warm
    with the laughter of friends and the thrill of nature''s beauty all around.'
  - No, there is no special bus service specifically for women or families traveling
    from the Bus Stand to the Mela. Shuttle buses would be available with fixed timings
    and route plans which offer convenient travel
- source_sentence: What is the ritual of Snan or bathing?
  sentences:
  - Yes, luggage porter services are available at Prayagraj Junction for pilgrims
    heading to the Mela. These porters, often referred to as coolies
  - 'Taking bath at the confluence of Ganga, Yamuna and invisible Saraswati during
    Mahakumbh has special significance. It is believed that by bathing in this holy
    confluence, all the sins of a person are washed away and he attains salvation.


    Bathing not only symbolizes personal purification, but it also conveys the message
    of social harmony and unity, where people from different cultures and communities
    come together to participate in this sacred ritual.


    It is considered that in special circumstances, the water of rivers also acquires
    a special life-giving quality, i.e. nectar, which not only leads to spiritual
    development along with purification of the mind, but also gives physical benefits
    by getting health.'
  - 'The art of knitting is a fascinating hobby that allows individuals to create
    beautiful and functional pieces from yarn. By intertwining strands of wool or
    cotton, one can produce items ranging from scarves to intricate sweaters. This
    craft has been passed down through generations, often bringing family members
    together for cozy evenings filled with creativity and conversation.


    Knitting not only provides a sense of accomplishment with every completed project
    but also promotes focus and relaxation, making it an excellent activity for reducing
    stress. Furthermore, the choice of colors and patterns can result in vibrant works
    of art, showcasing the unique style and personality of the knitter. Engaging in
    this craft often leads to new friendships within community groups that gather
    to share techniques and ideas, fostering a sense of belonging among enthusiasts.'
model-index:
- name: SentenceTransformer based on BAAI/bge-small-en-v1.5
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: val evaluator
      type: val_evaluator
    metrics:
    - type: cosine_accuracy@1
      value: 0.8156028368794326
      name: Cosine Accuracy@1
    - type: cosine_accuracy@5
      value: 0.9929078014184397
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 1.0
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.8156028368794326
      name: Cosine Precision@1
    - type: cosine_precision@5
      value: 0.1985815602836879
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09999999999999999
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.8156028368794326
      name: Cosine Recall@1
    - type: cosine_recall@5
      value: 0.9929078014184397
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 1.0
      name: Cosine Recall@10
    - type: cosine_ndcg@5
      value: 0.9154696629317853
      name: Cosine Ndcg@5
    - type: cosine_ndcg@10
      value: 0.9179959550389344
      name: Cosine Ndcg@10
    - type: cosine_ndcg@100
      value: 0.9179959550389344
      name: Cosine Ndcg@100
    - type: cosine_mrr@5
      value: 0.8891252955082741
      name: Cosine Mrr@5
    - type: cosine_mrr@10
      value: 0.8903073286052008
      name: Cosine Mrr@10
    - type: cosine_mrr@100
      value: 0.8903073286052008
      name: Cosine Mrr@100
    - type: cosine_map@100
      value: 0.8903073286052009
      name: Cosine Map@100
    - type: dot_accuracy@1
      value: 0.8156028368794326
      name: Dot Accuracy@1
    - type: dot_accuracy@5
      value: 0.9929078014184397
      name: Dot Accuracy@5
    - type: dot_accuracy@10
      value: 1.0
      name: Dot Accuracy@10
    - type: dot_precision@1
      value: 0.8156028368794326
      name: Dot Precision@1
    - type: dot_precision@5
      value: 0.1985815602836879
      name: Dot Precision@5
    - type: dot_precision@10
      value: 0.09999999999999999
      name: Dot Precision@10
    - type: dot_recall@1
      value: 0.8156028368794326
      name: Dot Recall@1
    - type: dot_recall@5
      value: 0.9929078014184397
      name: Dot Recall@5
    - type: dot_recall@10
      value: 1.0
      name: Dot Recall@10
    - type: dot_ndcg@5
      value: 0.9154696629317853
      name: Dot Ndcg@5
    - type: dot_ndcg@10
      value: 0.9179959550389344
      name: Dot Ndcg@10
    - type: dot_ndcg@100
      value: 0.9179959550389344
      name: Dot Ndcg@100
    - type: dot_mrr@5
      value: 0.8891252955082741
      name: Dot Mrr@5
    - type: dot_mrr@10
      value: 0.8903073286052008
      name: Dot Mrr@10
    - type: dot_mrr@100
      value: 0.8903073286052008
      name: Dot Mrr@100
    - type: dot_map@100
      value: 0.8903073286052009
      name: Dot Map@100
---

# SentenceTransformer based on BAAI/bge-small-en-v1.5

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) <!-- at revision 5c38ec7c405ec4b44b94cc5a9bb96e735b38267a -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
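
As a quick sanity check, the sequence length and embedding dimensionality listed above can be read back from the loaded model (a minimal sketch using the repository id from the Usage section below):

```python
from sentence_transformers import SentenceTransformer

# Load the fine-tuned model (same repository id as in the Usage section below)
model = SentenceTransformer("himanshu23099/bge_embedding_finetune_v3")

print(model.max_seq_length)                      # 512
print(model.get_sentence_embedding_dimension())  # 384
```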

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("himanshu23099/bge_embedding_finetune_v3")
# Run inference
sentences = [
    'What is the ritual of Snan or bathing?',
    'Taking bath at the confluence of Ganga, Yamuna and invisible Saraswati during Mahakumbh has special significance. It is believed that by bathing in this holy confluence, all the sins of a person are washed away and he attains salvation.\n\nBathing not only symbolizes personal purification, but it also conveys the message of social harmony and unity, where people from different cultures and communities come together to participate in this sacred ritual.\n\nIt is considered that in special circumstances, the water of rivers also acquires a special life-giving quality, i.e. nectar, which not only leads to spiritual development along with purification of the mind, but also gives physical benefits by getting health.',
    'The art of knitting is a fascinating hobby that allows individuals to create beautiful and functional pieces from yarn. By intertwining strands of wool or cotton, one can produce items ranging from scarves to intricate sweaters. This craft has been passed down through generations, often bringing family members together for cozy evenings filled with creativity and conversation.\n\nKnitting not only provides a sense of accomplishment with every completed project but also promotes focus and relaxation, making it an excellent activity for reducing stress. Furthermore, the choice of colors and patterns can result in vibrant works of art, showcasing the unique style and personality of the knitter. Engaging in this craft often leads to new friendships within community groups that gather to share techniques and ideas, fostering a sense of belonging among enthusiasts.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
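
For a small semantic-search style example, a minimal sketch is to encode a query and a handful of candidate passages separately and rank them by similarity (the query and corpus below are illustrative, loosely based on the widget examples above):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("himanshu23099/bge_embedding_finetune_v3")

query = "When does the Mela start and end?"  # illustrative query
corpus = [
    "Mahakumbh 2025 will start on 13 January with the Paush Purnima bath and end on 26 February with the Mahashivratri bath.",
    "Yes, most parking areas accept digital payment methods such as UPI, credit/debit cards, or mobile wallets.",
    "Yes, luggage porter services are available at Prayagraj Junction for pilgrims heading to the Mela.",
]

# Encode query and passages, then rank passages by similarity to the query
query_embedding = model.encode([query])
corpus_embeddings = model.encode(corpus)
scores = model.similarity(query_embedding, corpus_embeddings)  # shape: [1, 3]

best_idx = scores[0].argmax().item()
print(corpus[best_idx])
```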

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval
* Dataset: `val_evaluator`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.8156     |
| cosine_accuracy@5   | 0.9929     |
| cosine_accuracy@10  | 1.0        |
| cosine_precision@1  | 0.8156     |
| cosine_precision@5  | 0.1986     |
| cosine_precision@10 | 0.1        |
| cosine_recall@1     | 0.8156     |
| cosine_recall@5     | 0.9929     |
| cosine_recall@10    | 1.0        |
| cosine_ndcg@5       | 0.9155     |
| cosine_ndcg@10      | 0.918      |
| cosine_ndcg@100     | 0.918      |
| cosine_mrr@5        | 0.8891     |
| cosine_mrr@10       | 0.8903     |
| cosine_mrr@100      | 0.8903     |
| **cosine_map@100**  | **0.8903** |
| dot_accuracy@1      | 0.8156     |
| dot_accuracy@5      | 0.9929     |
| dot_accuracy@10     | 1.0        |
| dot_precision@1     | 0.8156     |
| dot_precision@5     | 0.1986     |
| dot_precision@10    | 0.1        |
| dot_recall@1        | 0.8156     |
| dot_recall@5        | 0.9929     |
| dot_recall@10       | 1.0        |
| dot_ndcg@5          | 0.9155     |
| dot_ndcg@10         | 0.918      |
| dot_ndcg@100        | 0.918      |
| dot_mrr@5           | 0.8891     |
| dot_mrr@10          | 0.8903     |
| dot_mrr@100         | 0.8903     |
| dot_map@100         | 0.8903     |
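
The table above was produced with the `InformationRetrievalEvaluator` linked earlier. A minimal sketch of how such an evaluation can be set up (the queries, corpus, and relevance judgments below are tiny hypothetical placeholders, not the actual validation split):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("himanshu23099/bge_embedding_finetune_v3")

# Hypothetical data: query id -> query text, doc id -> passage text, query id -> relevant doc ids
queries = {"q1": "What is the ritual of Snan or bathing?"}
corpus = {
    "d1": "Taking bath at the confluence of Ganga, Yamuna and invisible Saraswati during Mahakumbh has special significance.",
    "d2": "The art of knitting is a fascinating hobby that allows individuals to create beautiful pieces from yarn.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="val_evaluator",
)
results = evaluator(model)
print(results)  # per-metric scores such as val_evaluator_cosine_map@100
```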

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset


* Size: 563 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 563 samples:
  |         | anchor                                                                            | positive                                                                           | negative                                                                             |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             | string                                                                               |
  | details | <ul><li>min: 6 tokens</li><li>mean: 16.33 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 93.51 tokens</li><li>max: 402 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 109.62 tokens</li><li>max: 269 tokens</li></ul> |
* Samples:
  | anchor                                                                                                                  | positive                                                                                                                                                                                                                                                                      | negative                                                                                                                                                                                                                                                                                                                                                                                                                                       |
  |:------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Are there attached bathrooms in tents?</code>                                                                     | <code>Attached bathroom facilities in tents vary by vendor and tent type. To know more about the availability of attached bathrooms, please reach out to your chosen Tent City vendor.  For more information about these vendors and their services, please click here</code> | <code>The colors of the rainbow blend seamlessly across the canvas of the sky, creating a stunning visual display. Enjoying the beauty of nature can greatly enhance one's mood and inspire creativity. Take a moment to appreciate the vibrant hues and how they interact, as this can lead to a greater understanding of art and light. Exploring different forms of expression allows for personal growth and emotional exploration.</code> |
  | <code>Are there any discounts for senior citizens or children on buses traveling from the Bus Stand to the Mela?</code> | <code>No, there are no specific discounts available for senior citizens or children on buses traveling from the Bus Stand to the Mela. Standard ticket prices generally apply to all passengers.</code>                                                                       | <code>The vibrant colors of autumn leaves create a breathtaking scene as they cascade gently to the ground. Local parks become havens for photographers and nature enthusiasts alike, capturing the fleeting beauty of the season. Crisp air invigorates leisurely strolls, while children gather acorns and pinecones, crafting treasures from nature’s bounty.</code>                                                                        |
  | <code>Are there any luggage porter services available at Prayagraj Junction for pilgrims heading to the Mela?</code>    | <code>Yes, luggage porter services are available at Prayagraj Junction for pilgrims heading to the Mela. These porters, often referred to as coolies</code>                                                                                                                   | <code> can be hired directly at the station to assist with carrying luggage from the train platform to your onward transport or directly to the Mela area.</code>                                                                                                                                                                                                                                                                              |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01}
  ```
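
A minimal sketch of constructing this loss for (anchor, positive, negative) triplets like the samples above. The guide model is not named in this card; `sentence-transformers/all-MiniLM-L6-v2` below is only an assumption that matches the printed guide architecture (384-dimensional, mean pooling, max_seq_length 256):

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import GISTEmbedLoss

model = SentenceTransformer("BAAI/bge-small-en-v1.5")
# Assumed guide model: the card only shows the guide's architecture, not its name
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Triplet columns mirror the dataset layout above: anchor, positive, negative
train_dataset = Dataset.from_dict({
    "anchor": ["Are there attached bathrooms in tents?"],
    "positive": ["Attached bathroom facilities in tents vary by vendor and tent type."],
    "negative": ["The colors of the rainbow blend seamlessly across the canvas of the sky."],
})

loss = GISTEmbedLoss(model, guide, temperature=0.01)
```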

### Evaluation Dataset

#### Unnamed Dataset


* Size: 141 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 141 samples:
  |         | anchor                                                                            | positive                                                                           | negative                                                                             |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             | string                                                                               |
  | details | <ul><li>min: 6 tokens</li><li>mean: 16.05 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 88.91 tokens</li><li>max: 324 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 104.84 tokens</li><li>max: 262 tokens</li></ul> |
* Samples:
  | anchor                                                                                                             | positive                                                                                                                                                                                                                                                                                                                                                                                        | negative                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                              |
  |:-------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>What family-friendly tours are available?</code>                                                             | <code>All tours are designed with families in mind, ensuring a safe, comfortable, and enjoyable experience for all age groups. Whether traveling with children or elderly family members, the tours are structured to accommodate the needs of everyone in the group.<br><br>Specific tours for senior citizens are also available. To explore them, click here : https://bit.ly/4eWFRoH</code> | <code>The majestic mountains rise against the azure sky, their peaks adorned with glistening snow that sparkles in the sunlight. deep valleys shelter hidden waterfalls, where crystal-clear waters cascade gracefully over rocks, creating a tranquil sound reverberating through the lush landscape. Wildlife thrives here, and one may spot elusive deer grazing in the early morning mist. As dusk settles, the horizon transforms into a canvas of vibrant hues, painting a breathtaking sunset that captivates the soul. Each season unveils unique beauty, inviting adventurers to explore its wonders.</code> |
  | <code>What are the charges for a private taxi or cab from Prayagraj Airport to the Mela grounds?</code>            | <code>Private taxi charges are not fixed</code>                                                                                                                                                                                                                                                                                                                                                 | <code>The garden blooms vibrantly with colors and fragrances that attract butterflies and bees. Each petal holds a story from the earth, whispering tales of growth and resilience. Nearby, a small pond reflects the blue sky, while frogs leap joyfully on lily pads, creating ripples that dance across the surface. The sound of rustling leaves accompanies the gentle breeze, making nature's symphony a soothing backdrop for all who pause and appreciate this serene setting. As the sun sets, golden hues envelop the scene, inviting evening creatures to awaken under the twilight.</code>                |
  | <code>What are the options for traveling to the Kumbh Mela if I arrive late at night at Prayagraj Junction?</code> | <code>If you arrive late at night at Prayagraj Junction for the Kumbh Mela, you have majorly 2 options for travel. <br><br>1. Taxi/Cabs: You can easily find 24/7 taxi services outside the railway station. Prepaid taxis are the most convenient and safe option.<br><br>2. Auto Rickshaws:Auto rickshaws are readily available outside the railway station.</code>                           | <code>The blooming desert blooms with vibrant colors as dusk approaches. Amidst the sands, ancient stories whisper through the wind, recalling journeys of nomads who tread lightly upon the earth. Some dance beneath the starlit skies, celebrating the beauty of freedom and the vastness of their surroundings. The nocturnal creatures awaken, each sound echoing tales of survival and adventure. Beyond the horizon, a tapestry of dreams unfurls, where every grain of sand holds the promise of a new discovery waiting to be unveiled.</code>                                                               |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01}
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `gradient_accumulation_steps`: 2
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `num_train_epochs`: 90
- `warmup_ratio`: 0.1
- `load_best_model_at_end`: True
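
A minimal sketch of wiring these non-default settings into a training run. It reuses `model`, `train_dataset`, and `loss` from the loss sketch above and `evaluator` from the Evaluation section; the output directory and `eval_dataset` are placeholders:

```python
from sentence_transformers import SentenceTransformerTrainer, SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="bge_embedding_finetune_v3",  # placeholder output path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    gradient_accumulation_steps=2,
    learning_rate=1e-5,
    weight_decay=0.01,
    num_train_epochs=90,
    warmup_ratio=0.1,
    load_best_model_at_end=True,
)

trainer = SentenceTransformerTrainer(
    model=model,                  # BAAI/bge-small-en-v1.5, as in the loss sketch
    args=args,
    train_dataset=train_dataset,  # (anchor, positive, negative) triplets
    eval_dataset=eval_dataset,    # held-out triplets in the same format
    loss=loss,                    # GISTEmbedLoss with the guide model
    evaluator=evaluator,          # InformationRetrievalEvaluator from the Evaluation section
)
trainer.train()
```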

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 2
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 90
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
<details><summary>Click to expand</summary>

| Epoch       | Step     | Training Loss | Validation Loss | val_evaluator_cosine_map@100 |
|:-----------:|:--------:|:-------------:|:---------------:|:----------------------------:|
| 0.5556      | 10       | 0.9623        | 0.5803          | 0.7676                       |
| 1.1111      | 20       | 0.8653        | 0.5278          | 0.7684                       |
| 1.6667      | 30       | 0.9346        | 0.4556          | 0.7692                       |
| 2.2222      | 40       | 0.8058        | 0.3928          | 0.7687                       |
| 2.7778      | 50       | 0.6639        | 0.3282          | 0.7723                       |
| 3.3333      | 60       | 0.4974        | 0.2657          | 0.7784                       |
| 3.8889      | 70       | 0.4447        | 0.2130          | 0.7877                       |
| 4.4444      | 80       | 0.4309        | 0.1753          | 0.7922                       |
| 5.0         | 90       | 0.2755        | 0.1320          | 0.7951                       |
| 5.5556      | 100      | 0.3105        | 0.0826          | 0.8029                       |
| 6.1111      | 110      | 0.1539        | 0.0479          | 0.8106                       |
| 6.6667      | 120      | 0.22          | 0.0312          | 0.8141                       |
| 7.2222      | 130      | 0.235         | 0.0173          | 0.8245                       |
| 7.7778      | 140      | 0.1517        | 0.0119          | 0.8257                       |
| 8.3333      | 150      | 0.1328        | 0.0095          | 0.8311                       |
| 8.8889      | 160      | 0.1175        | 0.0055          | 0.8319                       |
| 9.4444      | 170      | 0.1178        | 0.0037          | 0.8308                       |
| 10.0        | 180      | 0.0598        | 0.0034          | 0.8338                       |
| 10.5556     | 190      | 0.0958        | 0.0030          | 0.8324                       |
| 11.1111     | 200      | 0.0681        | 0.0019          | 0.8331                       |
| 11.6667     | 210      | 0.069         | 0.0013          | 0.8406                       |
| 12.2222     | 220      | 0.0327        | 0.0009          | 0.8522                       |
| 12.7778     | 230      | 0.0833        | 0.0006          | 0.8589                       |
| 13.3333     | 240      | 0.0806        | 0.0005          | 0.8596                       |
| 13.8889     | 250      | 0.0714        | 0.0004          | 0.8658                       |
| 14.4444     | 260      | 0.0813        | 0.0004          | 0.8659                       |
| 15.0        | 270      | 0.0512        | 0.0003          | 0.8676                       |
| 15.5556     | 280      | 0.043         | 0.0003          | 0.8677                       |
| 16.1111     | 290      | 0.0526        | 0.0003          | 0.8677                       |
| 16.6667     | 300      | 0.0291        | 0.0002          | 0.8651                       |
| 17.2222     | 310      | 0.0487        | 0.0002          | 0.8662                       |
| 17.7778     | 320      | 0.054         | 0.0002          | 0.8621                       |
| 18.3333     | 330      | 0.067         | 0.0002          | 0.8652                       |
| 18.8889     | 340      | 0.0415        | 0.0002          | 0.8652                       |
| 19.4444     | 350      | 0.0484        | 0.0002          | 0.8652                       |
| 20.0        | 360      | 0.0304        | 0.0002          | 0.8690                       |
| 20.5556     | 370      | 0.025         | 0.0002          | 0.8697                       |
| 21.1111     | 380      | 0.0549        | 0.0002          | 0.8697                       |
| 21.6667     | 390      | 0.0375        | 0.0002          | 0.8736                       |
| 22.2222     | 400      | 0.0293        | 0.0002          | 0.8749                       |
| 22.7778     | 410      | 0.0558        | 0.0002          | 0.8728                       |
| 23.3333     | 420      | 0.0458        | 0.0002          | 0.8730                       |
| 23.8889     | 430      | 0.0235        | 0.0002          | 0.8730                       |
| 24.4444     | 440      | 0.0515        | 0.0002          | 0.8730                       |
| 25.0        | 450      | 0.0337        | 0.0002          | 0.8734                       |
| 25.5556     | 460      | 0.0376        | 0.0002          | 0.8734                       |
| 26.1111     | 470      | 0.0189        | 0.0003          | 0.8734                       |
| 26.6667     | 480      | 0.032         | 0.0002          | 0.8734                       |
| 27.2222     | 490      | 0.025         | 0.0002          | 0.8695                       |
| 27.7778     | 500      | 0.0258        | 0.0002          | 0.8704                       |
| 28.3333     | 510      | 0.0351        | 0.0002          | 0.8681                       |
| 28.8889     | 520      | 0.0285        | 0.0002          | 0.8679                       |
| 29.4444     | 530      | 0.0263        | 0.0002          | 0.8679                       |
| 30.0        | 540      | 0.0901        | 0.0002          | 0.8679                       |
| 30.5556     | 550      | 0.0323        | 0.0001          | 0.8686                       |
| 31.1111     | 560      | 0.0406        | 0.0001          | 0.8728                       |
| 31.6667     | 570      | 0.0302        | 0.0001          | 0.8712                       |
| 32.2222     | 580      | 0.0195        | 0.0001          | 0.8718                       |
| 32.7778     | 590      | 0.0665        | 0.0001          | 0.8718                       |
| 33.3333     | 600      | 0.0153        | 0.0001          | 0.8728                       |
| 33.8889     | 610      | 0.0378        | 0.0001          | 0.8728                       |
| 34.4444     | 620      | 0.0369        | 0.0001          | 0.8763                       |
| 35.0        | 630      | 0.0238        | 0.0001          | 0.8706                       |
| 35.5556     | 640      | 0.0275        | 0.0001          | 0.8720                       |
| 36.1111     | 650      | 0.0469        | 0.0001          | 0.8708                       |
| 36.6667     | 660      | 0.0438        | 0.0001          | 0.8788                       |
| 37.2222     | 670      | 0.0333        | 0.0001          | 0.8800                       |
| 37.7778     | 680      | 0.0186        | 0.0001          | 0.8765                       |
| 38.3333     | 690      | 0.0308        | 0.0001          | 0.8765                       |
| 38.8889     | 700      | 0.0713        | 0.0001          | 0.8767                       |
| 39.4444     | 710      | 0.0188        | 0.0001          | 0.8767                       |
| 40.0        | 720      | 0.0205        | 0.0001          | 0.8767                       |
| 40.5556     | 730      | 0.0261        | 0.0001          | 0.8767                       |
| 41.1111     | 740      | 0.0193        | 0.0001          | 0.8755                       |
| 41.6667     | 750      | 0.0367        | 0.0000          | 0.8755                       |
| 42.2222     | 760      | 0.0515        | 0.0000          | 0.8755                       |
| 42.7778     | 770      | 0.0649        | 0.0000          | 0.8844                       |
| 43.3333     | 780      | 0.0333        | 0.0000          | 0.8879                       |
| 43.8889     | 790      | 0.0498        | 0.0000          | 0.8868                       |
| 44.4444     | 800      | 0.0324        | 0.0000          | 0.8832                       |
| 45.0        | 810      | 0.0321        | 0.0000          | 0.8832                       |
| 45.5556     | 820      | 0.0354        | 0.0000          | 0.8832                       |
| 46.1111     | 830      | 0.04          | 0.0000          | 0.8868                       |
| 46.6667     | 840      | 0.0176        | 0.0000          | 0.8868                       |
| 47.2222     | 850      | 0.0297        | 0.0000          | 0.8868                       |
| 47.7778     | 860      | 0.0469        | 0.0000          | 0.8868                       |
| 48.3333     | 870      | 0.025         | 0.0000          | 0.8868                       |
| 48.8889     | 880      | 0.0425        | 0.0000          | 0.8868                       |
| 49.4444     | 890      | 0.0475        | 0.0000          | 0.8868                       |
| 50.0        | 900      | 0.0529        | 0.0000          | 0.8868                       |
| 50.5556     | 910      | 0.0312        | 0.0000          | 0.8868                       |
| 51.1111     | 920      | 0.0385        | 0.0000          | 0.8832                       |
| 51.6667     | 930      | 0.0316        | 0.0000          | 0.8832                       |
| 52.2222     | 940      | 0.0361        | 0.0000          | 0.8832                       |
| 52.7778     | 950      | 0.053         | 0.0000          | 0.8832                       |
| 53.3333     | 960      | 0.0226        | 0.0000          | 0.8868                       |
| 53.8889     | 970      | 0.0781        | 0.0000          | 0.8868                       |
| 54.4444     | 980      | 0.03          | 0.0000          | 0.8868                       |
| 55.0        | 990      | 0.0349        | 0.0000          | 0.8832                       |
| 55.5556     | 1000     | 0.0539        | 0.0000          | 0.8832                       |
| 56.1111     | 1010     | 0.0351        | 0.0000          | 0.8832                       |
| 56.6667     | 1020     | 0.0506        | 0.0000          | 0.8832                       |
| 57.2222     | 1030     | 0.0204        | 0.0000          | 0.8832                       |
| 57.7778     | 1040     | 0.0254        | 0.0000          | 0.8844                       |
| 58.3333     | 1050     | 0.0274        | 0.0000          | 0.8844                       |
| 58.8889     | 1060     | 0.001         | 0.0000          | 0.8844                       |
| 59.4444     | 1070     | 0.049         | 0.0000          | 0.8844                       |
| 60.0        | 1080     | 0.028         | 0.0000          | 0.8844                       |
| 60.5556     | 1090     | 0.0477        | 0.0000          | 0.8844                       |
| 61.1111     | 1100     | 0.0304        | 0.0000          | 0.8844                       |
| 61.6667     | 1110     | 0.0188        | 0.0000          | 0.8844                       |
| 62.2222     | 1120     | 0.0247        | 0.0000          | 0.8879                       |
| 62.7778     | 1130     | 0.0428        | 0.0000          | 0.8879                       |
| 63.3333     | 1140     | 0.0218        | 0.0000          | 0.8879                       |
| 63.8889     | 1150     | 0.0476        | 0.0000          | 0.8868                       |
| 64.4444     | 1160     | 0.021         | 0.0000          | 0.8868                       |
| 65.0        | 1170     | 0.0435        | 0.0000          | 0.8856                       |
| 65.5556     | 1180     | 0.0311        | 0.0000          | 0.8856                       |
| 66.1111     | 1190     | 0.0275        | 0.0000          | 0.8856                       |
| 66.6667     | 1200     | 0.0405        | 0.0000          | 0.8891                       |
| 67.2222     | 1210     | 0.0009        | 0.0000          | 0.8891                       |
| 67.7778     | 1220     | 0.0506        | 0.0000          | 0.8891                       |
| 68.3333     | 1230     | 0.0538        | 0.0000          | 0.8891                       |
| 68.8889     | 1240     | 0.0251        | 0.0000          | 0.8891                       |
| 69.4444     | 1250     | 0.0168        | 0.0000          | 0.8891                       |
| 70.0        | 1260     | 0.0527        | 0.0000          | 0.8903                       |
| 70.5556     | 1270     | 0.0491        | 0.0000          | 0.8903                       |
| 71.1111     | 1280     | 0.0092        | 0.0000          | 0.8903                       |
| 71.6667     | 1290     | 0.0257        | 0.0000          | 0.8903                       |
| **72.2222** | **1300** | **0.0455**    | **0.0**         | **0.8903**                   |
| 72.7778     | 1310     | 0.0271        | 0.0000          | 0.8903                       |
| 73.3333     | 1320     | 0.04          | 0.0000          | 0.8903                       |
| 73.8889     | 1330     | 0.0171        | 0.0000          | 0.8903                       |
| 74.4444     | 1340     | 0.0157        | 0.0000          | 0.8903                       |
| 75.0        | 1350     | 0.0323        | 0.0000          | 0.8903                       |
| 75.5556     | 1360     | 0.0372        | 0.0000          | 0.8903                       |
| 76.1111     | 1370     | 0.0109        | 0.0000          | 0.8903                       |
| 76.6667     | 1380     | 0.0358        | 0.0000          | 0.8903                       |
| 77.2222     | 1390     | 0.0279        | 0.0000          | 0.8903                       |
| 77.7778     | 1400     | 0.0173        | 0.0000          | 0.8903                       |
| 78.3333     | 1410     | 0.0409        | 0.0000          | 0.8903                       |
| 78.8889     | 1420     | 0.0139        | 0.0000          | 0.8903                       |
| 79.4444     | 1430     | 0.0123        | 0.0000          | 0.8903                       |
| 80.0        | 1440     | 0.0232        | 0.0000          | 0.8903                       |
| 80.5556     | 1450     | 0.0145        | 0.0000          | 0.8903                       |
| 81.1111     | 1460     | 0.0261        | 0.0000          | 0.8903                       |
| 81.6667     | 1470     | 0.0137        | 0.0000          | 0.8903                       |
| 82.2222     | 1480     | 0.0146        | 0.0000          | 0.8903                       |
| 82.7778     | 1490     | 0.0096        | 0.0000          | 0.8903                       |
| 83.3333     | 1500     | 0.0245        | 0.0000          | 0.8903                       |
| 83.8889     | 1510     | 0.0312        | 0.0000          | 0.8903                       |
| 84.4444     | 1520     | 0.0174        | 0.0000          | 0.8903                       |
| 85.0        | 1530     | 0.0437        | 0.0000          | 0.8903                       |
| 85.5556     | 1540     | 0.0301        | 0.0000          | 0.8903                       |
| 86.1111     | 1550     | 0.0119        | 0.0000          | 0.8903                       |
| 86.6667     | 1560     | 0.0554        | 0.0000          | 0.8903                       |
| 87.2222     | 1570     | 0.021         | 0.0000          | 0.8903                       |
| 87.7778     | 1580     | 0.029         | 0.0000          | 0.8903                       |
| 88.3333     | 1590     | 0.0132        | 0.0000          | 0.8903                       |
| 88.8889     | 1600     | 0.0339        | 0.0000          | 0.8903                       |
| 89.4444     | 1610     | 0.0412        | 0.0000          | 0.8903                       |
| 90.0        | 1620     | 0.0847        | 0.0000          | 0.8903                       |

* The bold row denotes the saved checkpoint.
</details>

### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.2.1
- Transformers: 4.44.2
- PyTorch: 2.5.0+cu121
- Accelerate: 0.34.2
- Datasets: 3.1.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### GISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->