Files changed (1)
  1. README.md +1699 -0
README.md CHANGED
@@ -4,6 +4,1705 @@ tags:
 - sentence-transformers
 - feature-extraction
 - sentence-similarity
7
+ - mteb
8
+ model-index:
9
+ - name: clip-ViT-B-32
10
+ results:
11
+ - task:
12
+ type: Classification
13
+ dataset:
14
+ type: mteb/amazon_counterfactual
15
+ name: MTEB AmazonCounterfactualClassification (en)
16
+ config: en
17
+ split: test
18
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
19
+ metrics:
20
+ - type: accuracy
21
+ value: 57.999999999999986
22
+ - type: ap
23
+ value: 23.966099106216358
24
+ - type: f1
25
+ value: 52.8203944454417
26
+ - task:
27
+ type: Classification
28
+ dataset:
29
+ type: mteb/amazon_polarity
30
+ name: MTEB AmazonPolarityClassification
31
+ config: default
32
+ split: test
33
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
34
+ metrics:
35
+ - type: accuracy
36
+ value: 62.366
37
+ - type: ap
38
+ value: 57.98090324593318
39
+ - type: f1
40
+ value: 61.62762218315074
41
+ - task:
42
+ type: Classification
43
+ dataset:
44
+ type: mteb/amazon_reviews_multi
45
+ name: MTEB AmazonReviewsClassification (en)
46
+ config: en
47
+ split: test
48
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
49
+ metrics:
50
+ - type: accuracy
51
+ value: 28.584
52
+ - type: f1
53
+ value: 28.463306116150783
54
+ - task:
55
+ type: Retrieval
56
+ dataset:
57
+ type: arguana
58
+ name: MTEB ArguAna
59
+ config: default
60
+ split: test
61
+ revision: None
62
+ metrics:
63
+ - type: map_at_1
64
+ value: 6.259
65
+ - type: map_at_10
66
+ value: 11.542
67
+ - type: map_at_100
68
+ value: 12.859000000000002
69
+ - type: map_at_1000
70
+ value: 12.966
71
+ - type: map_at_3
72
+ value: 9.128
73
+ - type: map_at_5
74
+ value: 10.262
75
+ - type: mrr_at_1
76
+ value: 6.259
77
+ - type: mrr_at_10
78
+ value: 11.536
79
+ - type: mrr_at_100
80
+ value: 12.859000000000002
81
+ - type: mrr_at_1000
82
+ value: 12.967
83
+ - type: mrr_at_3
84
+ value: 9.128
85
+ - type: mrr_at_5
86
+ value: 10.262
87
+ - type: ndcg_at_1
88
+ value: 6.259
89
+ - type: ndcg_at_10
90
+ value: 15.35
91
+ - type: ndcg_at_100
92
+ value: 22.107
93
+ - type: ndcg_at_1000
94
+ value: 25.355
95
+ - type: ndcg_at_3
96
+ value: 10.172
97
+ - type: ndcg_at_5
98
+ value: 12.22
99
+ - type: precision_at_1
100
+ value: 6.259
101
+ - type: precision_at_10
102
+ value: 2.795
103
+ - type: precision_at_100
104
+ value: 0.603
105
+ - type: precision_at_1000
106
+ value: 0.087
107
+ - type: precision_at_3
108
+ value: 4.41
109
+ - type: precision_at_5
110
+ value: 3.642
111
+ - type: recall_at_1
112
+ value: 6.259
113
+ - type: recall_at_10
114
+ value: 27.951999999999998
115
+ - type: recall_at_100
116
+ value: 60.313
117
+ - type: recall_at_1000
118
+ value: 86.771
119
+ - type: recall_at_3
120
+ value: 13.229
121
+ - type: recall_at_5
122
+ value: 18.208
123
+ - task:
124
+ type: Clustering
125
+ dataset:
126
+ type: mteb/arxiv-clustering-p2p
127
+ name: MTEB ArxivClusteringP2P
128
+ config: default
129
+ split: test
130
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
131
+ metrics:
132
+ - type: v_measure
133
+ value: 30.95753257205936
134
+ - task:
135
+ type: Clustering
136
+ dataset:
137
+ type: mteb/arxiv-clustering-s2s
138
+ name: MTEB ArxivClusteringS2S
139
+ config: default
140
+ split: test
141
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
142
+ metrics:
143
+ - type: v_measure
144
+ value: 26.586511396557583
145
+ - task:
146
+ type: Reranking
147
+ dataset:
148
+ type: mteb/askubuntudupquestions-reranking
149
+ name: MTEB AskUbuntuDupQuestions
150
+ config: default
151
+ split: test
152
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
153
+ metrics:
154
+ - type: map
155
+ value: 51.090393666506415
156
+ - type: mrr
157
+ value: 65.19412566503979
158
+ - task:
159
+ type: STS
160
+ dataset:
161
+ type: mteb/biosses-sts
162
+ name: MTEB BIOSSES
163
+ config: default
164
+ split: test
165
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
166
+ metrics:
167
+ - type: cos_sim_pearson
168
+ value: 69.9163188743249
169
+ - type: cos_sim_spearman
170
+ value: 64.1345938803495
171
+ - type: euclidean_pearson
172
+ value: 67.36703723549599
173
+ - type: euclidean_spearman
174
+ value: 63.067702100617005
175
+ - type: manhattan_pearson
176
+ value: 71.6901307580259
177
+ - type: manhattan_spearman
178
+ value: 67.04128661733944
179
+ - task:
180
+ type: Classification
181
+ dataset:
182
+ type: mteb/banking77
183
+ name: MTEB Banking77Classification
184
+ config: default
185
+ split: test
186
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
187
+ metrics:
188
+ - type: accuracy
189
+ value: 73.22402597402598
190
+ - type: f1
191
+ value: 73.12739303105114
192
+ - task:
193
+ type: Clustering
194
+ dataset:
195
+ type: mteb/biorxiv-clustering-p2p
196
+ name: MTEB BiorxivClusteringP2P
197
+ config: default
198
+ split: test
199
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
200
+ metrics:
201
+ - type: v_measure
202
+ value: 28.97385566120484
203
+ - task:
204
+ type: Clustering
205
+ dataset:
206
+ type: mteb/biorxiv-clustering-s2s
207
+ name: MTEB BiorxivClusteringS2S
208
+ config: default
209
+ split: test
210
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
211
+ metrics:
212
+ - type: v_measure
213
+ value: 27.08579813861177
214
+ - task:
215
+ type: Retrieval
216
+ dataset:
217
+ type: BeIR/cqadupstack
218
+ name: MTEB CQADupstackAndroidRetrieval
219
+ config: default
220
+ split: test
221
+ revision: None
222
+ metrics:
223
+ - type: map_at_1
224
+ value: 7.106999999999999
225
+ - type: map_at_10
226
+ value: 11.797
227
+ - type: map_at_100
228
+ value: 12.6
229
+ - type: map_at_1000
230
+ value: 12.711
231
+ - type: map_at_3
232
+ value: 10.369
233
+ - type: map_at_5
234
+ value: 10.881
235
+ - type: mrr_at_1
236
+ value: 9.299
237
+ - type: mrr_at_10
238
+ value: 15.076
239
+ - type: mrr_at_100
240
+ value: 15.842
241
+ - type: mrr_at_1000
242
+ value: 15.928
243
+ - type: mrr_at_3
244
+ value: 13.4
245
+ - type: mrr_at_5
246
+ value: 14.044
247
+ - type: ndcg_at_1
248
+ value: 9.299
249
+ - type: ndcg_at_10
250
+ value: 15.21
251
+ - type: ndcg_at_100
252
+ value: 19.374
253
+ - type: ndcg_at_1000
254
+ value: 22.527
255
+ - type: ndcg_at_3
256
+ value: 12.383
257
+ - type: ndcg_at_5
258
+ value: 13.096
259
+ - type: precision_at_1
260
+ value: 9.299
261
+ - type: precision_at_10
262
+ value: 3.1620000000000004
263
+ - type: precision_at_100
264
+ value: 0.662
265
+ - type: precision_at_1000
266
+ value: 0.11800000000000001
267
+ - type: precision_at_3
268
+ value: 6.3420000000000005
269
+ - type: precision_at_5
270
+ value: 4.492
271
+ - type: recall_at_1
272
+ value: 7.106999999999999
273
+ - type: recall_at_10
274
+ value: 22.544
275
+ - type: recall_at_100
276
+ value: 41.002
277
+ - type: recall_at_1000
278
+ value: 63.67699999999999
279
+ - type: recall_at_3
280
+ value: 14.316999999999998
281
+ - type: recall_at_5
282
+ value: 16.367
283
+ - task:
284
+ type: Retrieval
285
+ dataset:
286
+ type: BeIR/cqadupstack
287
+ name: MTEB CQADupstackEnglishRetrieval
288
+ config: default
289
+ split: test
290
+ revision: None
291
+ metrics:
292
+ - type: map_at_1
293
+ value: 6.632000000000001
294
+ - type: map_at_10
295
+ value: 9.067
296
+ - type: map_at_100
297
+ value: 9.487
298
+ - type: map_at_1000
299
+ value: 9.563
300
+ - type: map_at_3
301
+ value: 8.344999999999999
302
+ - type: map_at_5
303
+ value: 8.742999999999999
304
+ - type: mrr_at_1
305
+ value: 8.599
306
+ - type: mrr_at_10
307
+ value: 11.332
308
+ - type: mrr_at_100
309
+ value: 11.77
310
+ - type: mrr_at_1000
311
+ value: 11.843
312
+ - type: mrr_at_3
313
+ value: 10.478
314
+ - type: mrr_at_5
315
+ value: 10.959000000000001
316
+ - type: ndcg_at_1
317
+ value: 8.599
318
+ - type: ndcg_at_10
319
+ value: 10.843
320
+ - type: ndcg_at_100
321
+ value: 13.023000000000001
322
+ - type: ndcg_at_1000
323
+ value: 15.409
324
+ - type: ndcg_at_3
325
+ value: 9.673
326
+ - type: ndcg_at_5
327
+ value: 10.188
328
+ - type: precision_at_1
329
+ value: 8.599
330
+ - type: precision_at_10
331
+ value: 2.038
332
+ - type: precision_at_100
333
+ value: 0.383
334
+ - type: precision_at_1000
335
+ value: 0.074
336
+ - type: precision_at_3
337
+ value: 4.756
338
+ - type: precision_at_5
339
+ value: 3.3890000000000002
340
+ - type: recall_at_1
341
+ value: 6.632000000000001
342
+ - type: recall_at_10
343
+ value: 13.952
344
+ - type: recall_at_100
345
+ value: 23.966
346
+ - type: recall_at_1000
347
+ value: 41.411
348
+ - type: recall_at_3
349
+ value: 10.224
350
+ - type: recall_at_5
351
+ value: 11.799
352
+ - task:
353
+ type: Retrieval
354
+ dataset:
355
+ type: BeIR/cqadupstack
356
+ name: MTEB CQADupstackGamingRetrieval
357
+ config: default
358
+ split: test
359
+ revision: None
360
+ metrics:
361
+ - type: map_at_1
362
+ value: 11.153
363
+ - type: map_at_10
364
+ value: 15.751000000000001
365
+ - type: map_at_100
366
+ value: 16.464000000000002
367
+ - type: map_at_1000
368
+ value: 16.561
369
+ - type: map_at_3
370
+ value: 14.552000000000001
371
+ - type: map_at_5
372
+ value: 15.136
373
+ - type: mrr_at_1
374
+ value: 13.041
375
+ - type: mrr_at_10
376
+ value: 17.777
377
+ - type: mrr_at_100
378
+ value: 18.427
379
+ - type: mrr_at_1000
380
+ value: 18.504
381
+ - type: mrr_at_3
382
+ value: 16.479
383
+ - type: mrr_at_5
384
+ value: 17.175
385
+ - type: ndcg_at_1
386
+ value: 13.041
387
+ - type: ndcg_at_10
388
+ value: 18.581
389
+ - type: ndcg_at_100
390
+ value: 22.174
391
+ - type: ndcg_at_1000
392
+ value: 24.795
393
+ - type: ndcg_at_3
394
+ value: 16.185
395
+ - type: ndcg_at_5
396
+ value: 17.183
397
+ - type: precision_at_1
398
+ value: 13.041
399
+ - type: precision_at_10
400
+ value: 3.2230000000000003
401
+ - type: precision_at_100
402
+ value: 0.557
403
+ - type: precision_at_1000
404
+ value: 0.086
405
+ - type: precision_at_3
406
+ value: 7.544
407
+ - type: precision_at_5
408
+ value: 5.279
409
+ - type: recall_at_1
410
+ value: 11.153
411
+ - type: recall_at_10
412
+ value: 25.052999999999997
413
+ - type: recall_at_100
414
+ value: 41.521
415
+ - type: recall_at_1000
416
+ value: 61.138000000000005
417
+ - type: recall_at_3
418
+ value: 18.673000000000002
419
+ - type: recall_at_5
420
+ value: 20.964
421
+ - task:
422
+ type: Retrieval
423
+ dataset:
424
+ type: BeIR/cqadupstack
425
+ name: MTEB CQADupstackGisRetrieval
426
+ config: default
427
+ split: test
428
+ revision: None
429
+ metrics:
430
+ - type: map_at_1
431
+ value: 5.303
432
+ - type: map_at_10
433
+ value: 7.649
434
+ - type: map_at_100
435
+ value: 7.983
436
+ - type: map_at_1000
437
+ value: 8.067
438
+ - type: map_at_3
439
+ value: 6.938
440
+ - type: map_at_5
441
+ value: 7.259
442
+ - type: mrr_at_1
443
+ value: 5.763
444
+ - type: mrr_at_10
445
+ value: 8.277
446
+ - type: mrr_at_100
447
+ value: 8.665000000000001
448
+ - type: mrr_at_1000
449
+ value: 8.747
450
+ - type: mrr_at_3
451
+ value: 7.457999999999999
452
+ - type: mrr_at_5
453
+ value: 7.808
454
+ - type: ndcg_at_1
455
+ value: 5.763
456
+ - type: ndcg_at_10
457
+ value: 9.1
458
+ - type: ndcg_at_100
459
+ value: 11.253
460
+ - type: ndcg_at_1000
461
+ value: 13.847999999999999
462
+ - type: ndcg_at_3
463
+ value: 7.521999999999999
464
+ - type: ndcg_at_5
465
+ value: 8.094
466
+ - type: precision_at_1
467
+ value: 5.763
468
+ - type: precision_at_10
469
+ value: 1.514
470
+ - type: precision_at_100
471
+ value: 0.28700000000000003
472
+ - type: precision_at_1000
473
+ value: 0.054
474
+ - type: precision_at_3
475
+ value: 3.277
476
+ - type: precision_at_5
477
+ value: 2.282
478
+ - type: recall_at_1
479
+ value: 5.303
480
+ - type: recall_at_10
481
+ value: 13.126
482
+ - type: recall_at_100
483
+ value: 23.855
484
+ - type: recall_at_1000
485
+ value: 44.417
486
+ - type: recall_at_3
487
+ value: 8.556
488
+ - type: recall_at_5
489
+ value: 10.006
490
+ - task:
491
+ type: Retrieval
492
+ dataset:
493
+ type: BeIR/cqadupstack
494
+ name: MTEB CQADupstackMathematicaRetrieval
495
+ config: default
496
+ split: test
497
+ revision: None
498
+ metrics:
499
+ - type: map_at_1
500
+ value: 2.153
501
+ - type: map_at_10
502
+ value: 3.447
503
+ - type: map_at_100
504
+ value: 3.73
505
+ - type: map_at_1000
506
+ value: 3.8219999999999996
507
+ - type: map_at_3
508
+ value: 3.0269999999999997
509
+ - type: map_at_5
510
+ value: 3.283
511
+ - type: mrr_at_1
512
+ value: 2.612
513
+ - type: mrr_at_10
514
+ value: 4.289
515
+ - type: mrr_at_100
516
+ value: 4.6080000000000005
517
+ - type: mrr_at_1000
518
+ value: 4.713
519
+ - type: mrr_at_3
520
+ value: 3.669
521
+ - type: mrr_at_5
522
+ value: 4.005
523
+ - type: ndcg_at_1
524
+ value: 2.612
525
+ - type: ndcg_at_10
526
+ value: 4.422000000000001
527
+ - type: ndcg_at_100
528
+ value: 6.15
529
+ - type: ndcg_at_1000
530
+ value: 9.25
531
+ - type: ndcg_at_3
532
+ value: 3.486
533
+ - type: ndcg_at_5
534
+ value: 3.95
535
+ - type: precision_at_1
536
+ value: 2.612
537
+ - type: precision_at_10
538
+ value: 0.8829999999999999
539
+ - type: precision_at_100
540
+ value: 0.211
541
+ - type: precision_at_1000
542
+ value: 0.059000000000000004
543
+ - type: precision_at_3
544
+ value: 1.6580000000000001
545
+ - type: precision_at_5
546
+ value: 1.294
547
+ - type: recall_at_1
548
+ value: 2.153
549
+ - type: recall_at_10
550
+ value: 6.607
551
+ - type: recall_at_100
552
+ value: 14.707
553
+ - type: recall_at_1000
554
+ value: 37.99
555
+ - type: recall_at_3
556
+ value: 4.122
557
+ - type: recall_at_5
558
+ value: 5.241
559
+ - task:
560
+ type: Retrieval
561
+ dataset:
562
+ type: BeIR/cqadupstack
563
+ name: MTEB CQADupstackPhysicsRetrieval
564
+ config: default
565
+ split: test
566
+ revision: None
567
+ metrics:
568
+ - type: map_at_1
569
+ value: 7.976999999999999
570
+ - type: map_at_10
571
+ value: 11.745
572
+ - type: map_at_100
573
+ value: 12.427000000000001
574
+ - type: map_at_1000
575
+ value: 12.528
576
+ - type: map_at_3
577
+ value: 10.478
578
+ - type: map_at_5
579
+ value: 11.224
580
+ - type: mrr_at_1
581
+ value: 9.432
582
+ - type: mrr_at_10
583
+ value: 14.021
584
+ - type: mrr_at_100
585
+ value: 14.734
586
+ - type: mrr_at_1000
587
+ value: 14.813
588
+ - type: mrr_at_3
589
+ value: 12.576
590
+ - type: mrr_at_5
591
+ value: 13.414000000000001
592
+ - type: ndcg_at_1
593
+ value: 9.432
594
+ - type: ndcg_at_10
595
+ value: 14.341000000000001
596
+ - type: ndcg_at_100
597
+ value: 18.168
598
+ - type: ndcg_at_1000
599
+ value: 21.129
600
+ - type: ndcg_at_3
601
+ value: 11.909
602
+ - type: ndcg_at_5
603
+ value: 13.139999999999999
604
+ - type: precision_at_1
605
+ value: 9.432
606
+ - type: precision_at_10
607
+ value: 2.6759999999999997
608
+ - type: precision_at_100
609
+ value: 0.563
610
+ - type: precision_at_1000
611
+ value: 0.098
612
+ - type: precision_at_3
613
+ value: 5.679
614
+ - type: precision_at_5
615
+ value: 4.216
616
+ - type: recall_at_1
617
+ value: 7.976999999999999
618
+ - type: recall_at_10
619
+ value: 19.983999999999998
620
+ - type: recall_at_100
621
+ value: 37.181
622
+ - type: recall_at_1000
623
+ value: 58.714999999999996
624
+ - type: recall_at_3
625
+ value: 13.375
626
+ - type: recall_at_5
627
+ value: 16.54
628
+ - task:
629
+ type: Retrieval
630
+ dataset:
631
+ type: BeIR/cqadupstack
632
+ name: MTEB CQADupstackProgrammersRetrieval
633
+ config: default
634
+ split: test
635
+ revision: None
636
+ metrics:
637
+ - type: map_at_1
638
+ value: 5.682
639
+ - type: map_at_10
640
+ value: 7.817
641
+ - type: map_at_100
642
+ value: 8.3
643
+ - type: map_at_1000
644
+ value: 8.378
645
+ - type: map_at_3
646
+ value: 7.13
647
+ - type: map_at_5
648
+ value: 7.467
649
+ - type: mrr_at_1
650
+ value: 6.848999999999999
651
+ - type: mrr_at_10
652
+ value: 9.687999999999999
653
+ - type: mrr_at_100
654
+ value: 10.208
655
+ - type: mrr_at_1000
656
+ value: 10.281
657
+ - type: mrr_at_3
658
+ value: 8.770999999999999
659
+ - type: mrr_at_5
660
+ value: 9.256
661
+ - type: ndcg_at_1
662
+ value: 6.848999999999999
663
+ - type: ndcg_at_10
664
+ value: 9.519
665
+ - type: ndcg_at_100
666
+ value: 12.303
667
+ - type: ndcg_at_1000
668
+ value: 15.004999999999999
669
+ - type: ndcg_at_3
670
+ value: 8.077
671
+ - type: ndcg_at_5
672
+ value: 8.656
673
+ - type: precision_at_1
674
+ value: 6.848999999999999
675
+ - type: precision_at_10
676
+ value: 1.735
677
+ - type: precision_at_100
678
+ value: 0.363
679
+ - type: precision_at_1000
680
+ value: 0.073
681
+ - type: precision_at_3
682
+ value: 3.7289999999999996
683
+ - type: precision_at_5
684
+ value: 2.717
685
+ - type: recall_at_1
686
+ value: 5.682
687
+ - type: recall_at_10
688
+ value: 13.001
689
+ - type: recall_at_100
690
+ value: 25.916
691
+ - type: recall_at_1000
692
+ value: 46.303
693
+ - type: recall_at_3
694
+ value: 8.949
695
+ - type: recall_at_5
696
+ value: 10.413
697
+ - task:
698
+ type: Retrieval
699
+ dataset:
700
+ type: BeIR/cqadupstack
701
+ name: MTEB CQADupstackRetrieval
702
+ config: default
703
+ split: test
704
+ revision: None
705
+ metrics:
706
+ - type: map_at_1
707
+ value: 5.441
708
+ - type: map_at_10
709
+ value: 7.997500000000002
710
+ - type: map_at_100
711
+ value: 8.47225
712
+ - type: map_at_1000
713
+ value: 8.557083333333333
714
+ - type: map_at_3
715
+ value: 7.17025
716
+ - type: map_at_5
717
+ value: 7.597833333333333
718
+ - type: mrr_at_1
719
+ value: 6.6329166666666675
720
+ - type: mrr_at_10
721
+ value: 9.596583333333333
722
+ - type: mrr_at_100
723
+ value: 10.094416666666667
724
+ - type: mrr_at_1000
725
+ value: 10.171583333333334
726
+ - type: mrr_at_3
727
+ value: 8.628416666666666
728
+ - type: mrr_at_5
729
+ value: 9.143416666666667
730
+ - type: ndcg_at_1
731
+ value: 6.6329166666666675
732
+ - type: ndcg_at_10
733
+ value: 9.81258333333333
734
+ - type: ndcg_at_100
735
+ value: 12.459416666666666
736
+ - type: ndcg_at_1000
737
+ value: 15.099416666666668
738
+ - type: ndcg_at_3
739
+ value: 8.177499999999998
740
+ - type: ndcg_at_5
741
+ value: 8.8765
742
+ - type: precision_at_1
743
+ value: 6.6329166666666675
744
+ - type: precision_at_10
745
+ value: 1.8355833333333336
746
+ - type: precision_at_100
747
+ value: 0.38033333333333336
748
+ - type: precision_at_1000
749
+ value: 0.07358333333333333
750
+ - type: precision_at_3
751
+ value: 3.912583333333333
752
+ - type: precision_at_5
753
+ value: 2.8570833333333336
754
+ - type: recall_at_1
755
+ value: 5.441
756
+ - type: recall_at_10
757
+ value: 13.79075
758
+ - type: recall_at_100
759
+ value: 26.12841666666667
760
+ - type: recall_at_1000
761
+ value: 46.1115
762
+ - type: recall_at_3
763
+ value: 9.212416666666666
764
+ - type: recall_at_5
765
+ value: 11.006499999999999
766
+ - task:
767
+ type: Retrieval
768
+ dataset:
769
+ type: BeIR/cqadupstack
770
+ name: MTEB CQADupstackStatsRetrieval
771
+ config: default
772
+ split: test
773
+ revision: None
774
+ metrics:
775
+ - type: map_at_1
776
+ value: 4.973000000000001
777
+ - type: map_at_10
778
+ value: 6.583
779
+ - type: map_at_100
780
+ value: 7.013999999999999
781
+ - type: map_at_1000
782
+ value: 7.084
783
+ - type: map_at_3
784
+ value: 5.987
785
+ - type: map_at_5
786
+ value: 6.283999999999999
787
+ - type: mrr_at_1
788
+ value: 6.135
789
+ - type: mrr_at_10
790
+ value: 7.911
791
+ - type: mrr_at_100
792
+ value: 8.381
793
+ - type: mrr_at_1000
794
+ value: 8.451
795
+ - type: mrr_at_3
796
+ value: 7.234
797
+ - type: mrr_at_5
798
+ value: 7.595000000000001
799
+ - type: ndcg_at_1
800
+ value: 6.135
801
+ - type: ndcg_at_10
802
+ value: 7.8420000000000005
803
+ - type: ndcg_at_100
804
+ value: 10.335999999999999
805
+ - type: ndcg_at_1000
806
+ value: 12.742999999999999
807
+ - type: ndcg_at_3
808
+ value: 6.622
809
+ - type: ndcg_at_5
810
+ value: 7.156
811
+ - type: precision_at_1
812
+ value: 6.135
813
+ - type: precision_at_10
814
+ value: 1.3339999999999999
815
+ - type: precision_at_100
816
+ value: 0.293
817
+ - type: precision_at_1000
818
+ value: 0.053
819
+ - type: precision_at_3
820
+ value: 2.965
821
+ - type: precision_at_5
822
+ value: 2.086
823
+ - type: recall_at_1
824
+ value: 4.973000000000001
825
+ - type: recall_at_10
826
+ value: 10.497
827
+ - type: recall_at_100
828
+ value: 22.389
829
+ - type: recall_at_1000
830
+ value: 41.751
831
+ - type: recall_at_3
832
+ value: 7.248
833
+ - type: recall_at_5
834
+ value: 8.526
835
+ - task:
836
+ type: Retrieval
837
+ dataset:
838
+ type: BeIR/cqadupstack
839
+ name: MTEB CQADupstackTexRetrieval
840
+ config: default
841
+ split: test
842
+ revision: None
843
+ metrics:
844
+ - type: map_at_1
845
+ value: 2.541
846
+ - type: map_at_10
847
+ value: 4.168
848
+ - type: map_at_100
849
+ value: 4.492
850
+ - type: map_at_1000
851
+ value: 4.553
852
+ - type: map_at_3
853
+ value: 3.62
854
+ - type: map_at_5
855
+ value: 3.927
856
+ - type: mrr_at_1
857
+ value: 3.131
858
+ - type: mrr_at_10
859
+ value: 5.037
860
+ - type: mrr_at_100
861
+ value: 5.428
862
+ - type: mrr_at_1000
863
+ value: 5.487
864
+ - type: mrr_at_3
865
+ value: 4.422000000000001
866
+ - type: mrr_at_5
867
+ value: 4.752
868
+ - type: ndcg_at_1
869
+ value: 3.131
870
+ - type: ndcg_at_10
871
+ value: 5.315
872
+ - type: ndcg_at_100
873
+ value: 7.207
874
+ - type: ndcg_at_1000
875
+ value: 9.271
876
+ - type: ndcg_at_3
877
+ value: 4.244
878
+ - type: ndcg_at_5
879
+ value: 4.742
880
+ - type: precision_at_1
881
+ value: 3.131
882
+ - type: precision_at_10
883
+ value: 1.0699999999999998
884
+ - type: precision_at_100
885
+ value: 0.247
886
+ - type: precision_at_1000
887
+ value: 0.053
888
+ - type: precision_at_3
889
+ value: 2.1340000000000003
890
+ - type: precision_at_5
891
+ value: 1.624
892
+ - type: recall_at_1
893
+ value: 2.541
894
+ - type: recall_at_10
895
+ value: 7.8740000000000006
896
+ - type: recall_at_100
897
+ value: 16.896
898
+ - type: recall_at_1000
899
+ value: 32.423
900
+ - type: recall_at_3
901
+ value: 4.925
902
+ - type: recall_at_5
903
+ value: 6.181
904
+ - task:
905
+ type: Retrieval
906
+ dataset:
907
+ type: BeIR/cqadupstack
908
+ name: MTEB CQADupstackUnixRetrieval
909
+ config: default
910
+ split: test
911
+ revision: None
912
+ metrics:
913
+ - type: map_at_1
914
+ value: 5.58
915
+ - type: map_at_10
916
+ value: 7.758
917
+ - type: map_at_100
918
+ value: 8.168000000000001
919
+ - type: map_at_1000
920
+ value: 8.239
921
+ - type: map_at_3
922
+ value: 6.895999999999999
923
+ - type: map_at_5
924
+ value: 7.412000000000001
925
+ - type: mrr_at_1
926
+ value: 6.81
927
+ - type: mrr_at_10
928
+ value: 9.295
929
+ - type: mrr_at_100
930
+ value: 9.763
931
+ - type: mrr_at_1000
932
+ value: 9.835
933
+ - type: mrr_at_3
934
+ value: 8.427
935
+ - type: mrr_at_5
936
+ value: 8.958
937
+ - type: ndcg_at_1
938
+ value: 6.81
939
+ - type: ndcg_at_10
940
+ value: 9.436
941
+ - type: ndcg_at_100
942
+ value: 11.955
943
+ - type: ndcg_at_1000
944
+ value: 14.387
945
+ - type: ndcg_at_3
946
+ value: 7.7410000000000005
947
+ - type: ndcg_at_5
948
+ value: 8.622
949
+ - type: precision_at_1
950
+ value: 6.81
951
+ - type: precision_at_10
952
+ value: 1.6230000000000002
953
+ - type: precision_at_100
954
+ value: 0.335
955
+ - type: precision_at_1000
956
+ value: 0.062
957
+ - type: precision_at_3
958
+ value: 3.576
959
+ - type: precision_at_5
960
+ value: 2.6870000000000003
961
+ - type: recall_at_1
962
+ value: 5.58
963
+ - type: recall_at_10
964
+ value: 13.232
965
+ - type: recall_at_100
966
+ value: 25.233
967
+ - type: recall_at_1000
968
+ value: 43.864999999999995
969
+ - type: recall_at_3
970
+ value: 8.549
971
+ - type: recall_at_5
972
+ value: 10.799
973
+ - task:
974
+ type: Retrieval
975
+ dataset:
976
+ type: BeIR/cqadupstack
977
+ name: MTEB CQADupstackWebmastersRetrieval
978
+ config: default
979
+ split: test
980
+ revision: None
981
+ metrics:
982
+ - type: map_at_1
983
+ value: 3.8739999999999997
984
+ - type: map_at_10
985
+ value: 6.491
986
+ - type: map_at_100
987
+ value: 7.065
988
+ - type: map_at_1000
989
+ value: 7.185
990
+ - type: map_at_3
991
+ value: 5.568
992
+ - type: map_at_5
993
+ value: 6.1080000000000005
994
+ - type: mrr_at_1
995
+ value: 5.335999999999999
996
+ - type: mrr_at_10
997
+ value: 8.288
998
+ - type: mrr_at_100
999
+ value: 8.886
1000
+ - type: mrr_at_1000
1001
+ value: 8.976
1002
+ - type: mrr_at_3
1003
+ value: 7.115
1004
+ - type: mrr_at_5
1005
+ value: 7.846
1006
+ - type: ndcg_at_1
1007
+ value: 5.335999999999999
1008
+ - type: ndcg_at_10
1009
+ value: 8.463
1010
+ - type: ndcg_at_100
1011
+ value: 11.456
1012
+ - type: ndcg_at_1000
1013
+ value: 14.662
1014
+ - type: ndcg_at_3
1015
+ value: 6.7589999999999995
1016
+ - type: ndcg_at_5
1017
+ value: 7.5969999999999995
1018
+ - type: precision_at_1
1019
+ value: 5.335999999999999
1020
+ - type: precision_at_10
1021
+ value: 1.9369999999999998
1022
+ - type: precision_at_100
1023
+ value: 0.498
1024
+ - type: precision_at_1000
1025
+ value: 0.116
1026
+ - type: precision_at_3
1027
+ value: 3.689
1028
+ - type: precision_at_5
1029
+ value: 2.9250000000000003
1030
+ - type: recall_at_1
1031
+ value: 3.8739999999999997
1032
+ - type: recall_at_10
1033
+ value: 12.281
1034
+ - type: recall_at_100
1035
+ value: 26.368000000000002
1036
+ - type: recall_at_1000
1037
+ value: 50.422
1038
+ - type: recall_at_3
1039
+ value: 7.353
1040
+ - type: recall_at_5
1041
+ value: 9.66
1042
+ - task:
1043
+ type: Retrieval
1044
+ dataset:
1045
+ type: BeIR/cqadupstack
1046
+ name: MTEB CQADupstackWordpressRetrieval
1047
+ config: default
1048
+ split: test
1049
+ revision: None
1050
+ metrics:
1051
+ - type: map_at_1
1052
+ value: 2.317
1053
+ - type: map_at_10
1054
+ value: 3.697
1055
+ - type: map_at_100
1056
+ value: 3.9370000000000003
1057
+ - type: map_at_1000
1058
+ value: 3.994
1059
+ - type: map_at_3
1060
+ value: 3.1329999999999996
1061
+ - type: map_at_5
1062
+ value: 3.45
1063
+ - type: mrr_at_1
1064
+ value: 2.588
1065
+ - type: mrr_at_10
1066
+ value: 4.168
1067
+ - type: mrr_at_100
1068
+ value: 4.421
1069
+ - type: mrr_at_1000
1070
+ value: 4.481
1071
+ - type: mrr_at_3
1072
+ value: 3.512
1073
+ - type: mrr_at_5
1074
+ value: 3.909
1075
+ - type: ndcg_at_1
1076
+ value: 2.588
1077
+ - type: ndcg_at_10
1078
+ value: 4.679
1079
+ - type: ndcg_at_100
1080
+ value: 6.114
1081
+ - type: ndcg_at_1000
1082
+ value: 8.167
1083
+ - type: ndcg_at_3
1084
+ value: 3.5290000000000004
1085
+ - type: ndcg_at_5
1086
+ value: 4.093999999999999
1087
+ - type: precision_at_1
1088
+ value: 2.588
1089
+ - type: precision_at_10
1090
+ value: 0.832
1091
+ - type: precision_at_100
1092
+ value: 0.165
1093
+ - type: precision_at_1000
1094
+ value: 0.037
1095
+ - type: precision_at_3
1096
+ value: 1.6019999999999999
1097
+ - type: precision_at_5
1098
+ value: 1.294
1099
+ - type: recall_at_1
1100
+ value: 2.317
1101
+ - type: recall_at_10
1102
+ value: 7.338
1103
+ - type: recall_at_100
1104
+ value: 14.507
1105
+ - type: recall_at_1000
1106
+ value: 31.226
1107
+ - type: recall_at_3
1108
+ value: 4.258
1109
+ - type: recall_at_5
1110
+ value: 5.582
1111
+  - task:
+      type: Classification
+    dataset:
+      type: mteb/emotion
+      name: MTEB EmotionClassification
+      config: default
+      split: test
+      revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
+    metrics:
+    - type: accuracy
+      value: 33.535
+    - type: f1
+      value: 29.64261331714107
+  - task:
+      type: Classification
+    dataset:
+      type: mteb/imdb
+      name: MTEB ImdbClassification
+      config: default
+      split: test
+      revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
+    metrics:
+    - type: accuracy
+      value: 57.03359999999999
+    - type: ap
+      value: 54.289515246345985
+    - type: f1
+      value: 56.404319444675686
+  - task:
+      type: Classification
+    dataset:
+      type: mteb/mtop_domain
+      name: MTEB MTOPDomainClassification (en)
+      config: en
+      split: test
+      revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
+    metrics:
+    - type: accuracy
+      value: 86.70770633834928
+    - type: f1
+      value: 86.3521440956975
+  - task:
+      type: Classification
+    dataset:
+      type: mteb/mtop_intent
+      name: MTEB MTOPIntentClassification (en)
+      config: en
+      split: test
+      revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
+    metrics:
+    - type: accuracy
+      value: 62.35750113999089
+    - type: f1
+      value: 41.01929492285308
+  - task:
+      type: Classification
+    dataset:
+      type: mteb/amazon_massive_intent
+      name: MTEB MassiveIntentClassification (en)
+      config: en
+      split: test
+      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
+    metrics:
+    - type: accuracy
+      value: 63.34902488231339
+    - type: f1
+      value: 59.90320313789715
+  - task:
+      type: Classification
+    dataset:
+      type: mteb/amazon_massive_scenario
+      name: MTEB MassiveScenarioClassification (en)
+      config: en
+      split: test
+      revision: 7d571f92784cd94a019292a1f45445077d0ef634
+    metrics:
+    - type: accuracy
+      value: 72.51513113651649
+    - type: f1
+      value: 72.02695487206958
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/medrxiv-clustering-p2p
+      name: MTEB MedrxivClusteringP2P
+      config: default
+      split: test
+      revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
+    metrics:
+    - type: v_measure
+      value: 27.274796122083107
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/medrxiv-clustering-s2s
+      name: MTEB MedrxivClusteringS2S
+      config: default
+      split: test
+      revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
+    metrics:
+    - type: v_measure
+      value: 26.79725352760558
+  - task:
+      type: Reranking
+    dataset:
+      type: mteb/mind_small
+      name: MTEB MindSmallReranking
+      config: default
+      split: test
+      revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
+    metrics:
+    - type: map
+      value: 26.13036834909186
+    - type: mrr
+      value: 26.44693141383913
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/reddit-clustering
+      name: MTEB RedditClustering
+      config: default
+      split: test
+      revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
+    metrics:
+    - type: v_measure
+      value: 42.20822777687787
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/reddit-clustering-p2p
+      name: MTEB RedditClusteringP2P
+      config: default
+      split: test
+      revision: 282350215ef01743dc01b456c7f5241fa8937f16
+    metrics:
+    - type: v_measure
+      value: 50.46829369249206
+  - task:
+      type: STS
+    dataset:
+      type: mteb/sickr-sts
+      name: MTEB SICK-R
+      config: default
+      split: test
+      revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
+    metrics:
+    - type: cos_sim_pearson
+      value: 68.56822847816088
+    - type: cos_sim_spearman
+      value: 67.89762106712074
+    - type: euclidean_pearson
+      value: 72.85990051290051
+    - type: euclidean_spearman
+      value: 70.57485701927138
+    - type: manhattan_pearson
+      value: 75.55042864114424
+    - type: manhattan_spearman
+      value: 71.93915751894929
+  - task:
+      type: STS
+    dataset:
+      type: mteb/sts12-sts
+      name: MTEB STS12
+      config: default
+      split: test
+      revision: a0d554a64d88156834ff5ae9920b964011b16384
+    metrics:
+    - type: cos_sim_pearson
+      value: 75.78267692127127
+    - type: cos_sim_spearman
+      value: 72.29619737860627
+    - type: euclidean_pearson
+      value: 70.1450545025718
+    - type: euclidean_spearman
+      value: 67.45917489688871
+    - type: manhattan_pearson
+      value: 71.38506807589515
+    - type: manhattan_spearman
+      value: 67.2756870294321
+  - task:
+      type: STS
+    dataset:
+      type: mteb/sts13-sts
+      name: MTEB STS13
+      config: default
+      split: test
+      revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
+    metrics:
+    - type: cos_sim_pearson
+      value: 58.319097196559234
+    - type: cos_sim_spearman
+      value: 64.92943196450905
+    - type: euclidean_pearson
+      value: 66.58719740666398
+    - type: euclidean_spearman
+      value: 67.53564380155727
+    - type: manhattan_pearson
+      value: 68.40736205376945
+    - type: manhattan_spearman
+      value: 68.83617823881784
+  - task:
+      type: STS
+    dataset:
+      type: mteb/sts14-sts
+      name: MTEB STS14
+      config: default
+      split: test
+      revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
+    metrics:
+    - type: cos_sim_pearson
+      value: 53.31328133696752
+    - type: cos_sim_spearman
+      value: 54.95348091071938
+    - type: euclidean_pearson
+      value: 62.387046499499476
+    - type: euclidean_spearman
+      value: 61.1353898211832
+    - type: manhattan_pearson
+      value: 65.6417443455959
+    - type: manhattan_spearman
+      value: 63.242670107784384
+  - task:
+      type: STS
+    dataset:
+      type: mteb/sts15-sts
+      name: MTEB STS15
+      config: default
+      split: test
+      revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
+    metrics:
+    - type: cos_sim_pearson
+      value: 60.528757851414014
+    - type: cos_sim_spearman
+      value: 64.23576213334218
+    - type: euclidean_pearson
+      value: 72.97957845156205
+    - type: euclidean_spearman
+      value: 73.65719038687413
+    - type: manhattan_pearson
+      value: 74.78225875672878
+    - type: manhattan_spearman
+      value: 75.49116886100272
+  - task:
+      type: STS
+    dataset:
+      type: mteb/sts16-sts
+      name: MTEB STS16
+      config: default
+      split: test
+      revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
+    metrics:
+    - type: cos_sim_pearson
+      value: 61.987373739107696
+    - type: cos_sim_spearman
+      value: 70.192277875975
+    - type: euclidean_pearson
+      value: 72.63709361494375
+    - type: euclidean_spearman
+      value: 73.11242796462018
+    - type: manhattan_pearson
+      value: 73.72926634930128
+    - type: manhattan_spearman
+      value: 73.98477033865957
+  - task:
+      type: STS
+    dataset:
+      type: mteb/sts17-crosslingual-sts
+      name: MTEB STS17 (en-en)
+      config: en-en
+      split: test
+      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+    metrics:
+    - type: cos_sim_pearson
+      value: 70.04069064325459
+    - type: cos_sim_spearman
+      value: 74.38400000348688
+    - type: euclidean_pearson
+      value: 82.08401389635375
+    - type: euclidean_spearman
+      value: 81.95480539585296
+    - type: manhattan_pearson
+      value: 84.99052315893229
+    - type: manhattan_spearman
+      value: 84.66072647748268
+  - task:
+      type: STS
+    dataset:
+      type: mteb/sts22-crosslingual-sts
+      name: MTEB STS22 (en)
+      config: en
+      split: test
+      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+    metrics:
+    - type: cos_sim_pearson
+      value: 48.154362861306986
+    - type: cos_sim_spearman
+      value: 48.58749841932341
+    - type: euclidean_pearson
+      value: 50.41642902043279
+    - type: euclidean_spearman
+      value: 51.371094727414935
+    - type: manhattan_pearson
+      value: 53.06081362594791
+    - type: manhattan_spearman
+      value: 52.92177971301313
+  - task:
+      type: STS
+    dataset:
+      type: mteb/stsbenchmark-sts
+      name: MTEB STSBenchmark
+      config: default
+      split: test
+      revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
+    metrics:
+    - type: cos_sim_pearson
+      value: 59.11445268439452
+    - type: cos_sim_spearman
+      value: 61.46376153396639
+    - type: euclidean_pearson
+      value: 70.4367704900615
+    - type: euclidean_spearman
+      value: 69.71716383694748
+    - type: manhattan_pearson
+      value: 72.72973072359753
+    - type: manhattan_spearman
+      value: 71.48785771698903
+  - task:
+      type: Reranking
+    dataset:
+      type: mteb/scidocs-reranking
+      name: MTEB SciDocsRR
+      config: default
+      split: test
+      revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
+    metrics:
+    - type: map
+      value: 69.56970649232905
+    - type: mrr
+      value: 89.47439089595952
+  - task:
+      type: PairClassification
+    dataset:
+      type: mteb/sprintduplicatequestions-pairclassification
+      name: MTEB SprintDuplicateQuestions
+      config: default
+      split: test
+      revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
+    metrics:
+    - type: cos_sim_accuracy
+      value: 99.6009900990099
+    - type: cos_sim_ap
+      value: 85.25193332879603
+    - type: cos_sim_f1
+      value: 78.88563049853373
+    - type: cos_sim_precision
+      value: 77.151051625239
+    - type: cos_sim_recall
+      value: 80.7
+    - type: dot_accuracy
+      value: 99.01287128712872
+    - type: dot_ap
+      value: 7.20643686800152
+    - type: dot_f1
+      value: 14.143920595533496
+    - type: dot_precision
+      value: 9.405940594059405
+    - type: dot_recall
+      value: 28.499999999999996
+    - type: euclidean_accuracy
+      value: 99.590099009901
+    - type: euclidean_ap
+      value: 83.37987878104964
+    - type: euclidean_f1
+      value: 78.22990844354018
+    - type: euclidean_precision
+      value: 79.60662525879917
+    - type: euclidean_recall
+      value: 76.9
+    - type: manhattan_accuracy
+      value: 99.609900990099
+    - type: manhattan_ap
+      value: 85.6481020725528
+    - type: manhattan_f1
+      value: 79.23790913531998
+    - type: manhattan_precision
+      value: 77.45940783190068
+    - type: manhattan_recall
+      value: 81.10000000000001
+    - type: max_accuracy
+      value: 99.609900990099
+    - type: max_ap
+      value: 85.6481020725528
+    - type: max_f1
+      value: 79.23790913531998
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/stackexchange-clustering
+      name: MTEB StackExchangeClustering
+      config: default
+      split: test
+      revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
+    metrics:
+    - type: v_measure
+      value: 51.49824324480644
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/stackexchange-clustering-p2p
+      name: MTEB StackExchangeClusteringP2P
+      config: default
+      split: test
+      revision: 815ca46b2622cec33ccafc3735d572c266efdb44
+    metrics:
+    - type: v_measure
+      value: 29.27365407025942
+  - task:
+      type: Reranking
+    dataset:
+      type: mteb/stackoverflowdupquestions-reranking
+      name: MTEB StackOverflowDupQuestions
+      config: default
+      split: test
+      revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
+    metrics:
+    - type: map
+      value: 37.62142967031895
+    - type: mrr
+      value: 37.80931690858162
+  - task:
+      type: Summarization
+    dataset:
+      type: mteb/summeval
+      name: MTEB SummEval
+      config: default
+      split: test
+      revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
+    metrics:
+    - type: cos_sim_pearson
+      value: 27.279280594311935
+    - type: cos_sim_spearman
+      value: 28.055012324260563
+    - type: dot_pearson
+      value: 19.315154386546453
+    - type: dot_spearman
+      value: 19.17304603866006
+  - task:
+      type: Classification
+    dataset:
+      type: mteb/toxic_conversations_50k
+      name: MTEB ToxicConversationsClassification
+      config: default
+      split: test
+      revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
+    metrics:
+    - type: accuracy
+      value: 63.2888
+    - type: ap
+      value: 11.062527367094436
+    - type: f1
+      value: 48.6893658037416
+  - task:
+      type: Classification
+    dataset:
+      type: mteb/tweet_sentiment_extraction
+      name: MTEB TweetSentimentExtractionClassification
+      config: default
+      split: test
+      revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
+    metrics:
+    - type: accuracy
+      value: 49.275608375778155
+    - type: f1
+      value: 49.487704374827324
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/twentynewsgroups-clustering
+      name: MTEB TwentyNewsgroupsClustering
+      config: default
+      split: test
+      revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
+    metrics:
+    - type: v_measure
+      value: 37.31132794113957
+  - task:
+      type: PairClassification
+    dataset:
+      type: mteb/twittersemeval2015-pairclassification
+      name: MTEB TwitterSemEval2015
+      config: default
+      split: test
+      revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
+    metrics:
+    - type: cos_sim_accuracy
+      value: 80.008344757704
+    - type: cos_sim_ap
+      value: 50.955726976655036
+    - type: cos_sim_f1
+      value: 49.800796812749006
+    - type: cos_sim_precision
+      value: 42.8898208158597
+    - type: cos_sim_recall
+      value: 59.36675461741425
+    - type: dot_accuracy
+      value: 77.42743041068128
+    - type: dot_ap
+      value: 19.216239898966027
+    - type: dot_f1
+      value: 36.95323548056761
+    - type: dot_precision
+      value: 22.665550038882575
+    - type: dot_recall
+      value: 99.9736147757256
+    - type: euclidean_accuracy
+      value: 81.12296596530965
+    - type: euclidean_ap
+      value: 55.99371814327642
+    - type: euclidean_f1
+      value: 54.55376528396755
+    - type: euclidean_precision
+      value: 48.11529933481153
+    - type: euclidean_recall
+      value: 62.98153034300792
+    - type: manhattan_accuracy
+      value: 81.3673481552125
+    - type: manhattan_ap
+      value: 57.126538198748456
+    - type: manhattan_f1
+      value: 55.38567651454189
+    - type: manhattan_precision
+      value: 49.073130983907106
+    - type: manhattan_recall
+      value: 63.562005277044854
+    - type: max_accuracy
+      value: 81.3673481552125
+    - type: max_ap
+      value: 57.126538198748456
+    - type: max_f1
+      value: 55.38567651454189
+  - task:
+      type: PairClassification
+    dataset:
+      type: mteb/twitterurlcorpus-pairclassification
+      name: MTEB TwitterURLCorpus
+      config: default
+      split: test
+      revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
+    metrics:
+    - type: cos_sim_accuracy
+      value: 84.02607986960065
+    - type: cos_sim_ap
+      value: 74.07757228336027
+    - type: cos_sim_f1
+      value: 66.0694778239021
+    - type: cos_sim_precision
+      value: 62.67790520934089
+    - type: cos_sim_recall
+      value: 69.84909146904835
+    - type: dot_accuracy
+      value: 74.79722125198897
+    - type: dot_ap
+      value: 25.478024888904727
+    - type: dot_f1
+      value: 40.76642277589147
+    - type: dot_precision
+      value: 25.705095989546688
+    - type: dot_recall
+      value: 98.45241761626117
+    - type: euclidean_accuracy
+      value: 85.51053673303062
+    - type: euclidean_ap
+      value: 78.24178926488659
+    - type: euclidean_f1
+      value: 70.50944224857267
+    - type: euclidean_precision
+      value: 67.19447544642857
+    - type: euclidean_recall
+      value: 74.16846319679703
+    - type: manhattan_accuracy
+      value: 85.72398804672643
+    - type: manhattan_ap
+      value: 78.90411073933831
+    - type: manhattan_f1
+      value: 70.90586145648314
+    - type: manhattan_precision
+      value: 65.8224508640021
+    - type: manhattan_recall
+      value: 76.84016014782877
+    - type: max_accuracy
+      value: 85.72398804672643
+    - type: max_ap
+      value: 78.90411073933831
+    - type: max_f1
+      value: 70.90586145648314
+  ---

  # clip-ViT-B-32