Shitao committed (verified)
Commit 2f21a44 · Parent: bd0b867

Upload folder using huggingface_hub

Files changed (1):
  1. README.md +4 -32
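
The commit message indicates the folder was pushed with the `huggingface_hub` client. As a rough sketch of how such an upload is typically issued (the local path and `repo_id` below are illustrative placeholders, not values taken from this commit):

```python
# Minimal sketch of a folder upload via huggingface_hub.
# folder_path and repo_id are illustrative placeholders, not values from this commit.
from huggingface_hub import HfApi

api = HfApi()  # uses the token stored by `huggingface-cli login`
api.upload_folder(
    folder_path="./bge-en-icl",           # local directory containing README.md, weights, etc.
    repo_id="your-namespace/bge-en-icl",  # target repository on the Hub
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)
```

Each call produces a single commit; in this one only `README.md` changed.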
README.md CHANGED
@@ -2618,41 +2618,13 @@ print(scores.tolist())

 `bge-en-icl` achieves **state-of-the-art performance on both the MTEB and AIR-Bench leaderboards!**

- - **MTEB**:

- | MTEB | STS (10) | Summarization (1) | Pair Classification (3) | Classification (12) | Reranking (4) | Clustering (11) | Retrieval (15) | ALL (56) |
- |:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
- | **e5-mistral-7b-instruct** | 84.62 | 31.40 | 88.37 | 78.48 | 60.20 | 50.26 | 56.89 | 66.60 |
- | **SFR-Embedding-Mistral** | **85.05** | 31.16 | **88.54** | 78.33 | 60.64 | 51.67 | 59.03 | 67.56 |
- | **NV-Embed-v1** | 82.84 | 31.20 | 86.91 | 87.35 | 60.54 | 52.80 | 59.36 | 69.32 |
- | **Linq-Embed-Mistral** | 84.97 | 31.00 | 88.35 | 80.16 | 60.29 | 51.42 | 60.19 | 68.17 |
- | **SFR-Embedding-2_R** | 81.26 | 30.71 | 88.07 | 89.05 | 60.14 | 56.17 | 60.18 | 70.31 |
- | **gte-Qwen2-7B-instruct** | 83.04 | 31.35 | 85.79 | 86.58 | **61.42** | 56.92 | 60.25 | 70.24 |
- | **stella_en_1.5B_v5** | 84.51 | **31.49** | 88.07 | 88.07 | 61.21 | 57.69 | 61.21 | 71.19 |
- | **bge-multilingual-gemma2** | 83.88 | 31.20 | 85.84 | 88.08 | 59.72 | 54.65 | 59.24 | 69.88 |
- | **bge-en-icl zero-shot** | 83.74 | 30.75 | 87.21 | 88.66 | 59.66 | 57.57 | 61.67 | 71.26 |
- | **bge-en-icl few-shot** | 84.25 | 30.77 | 88.38 | **88.99** | 59.82 | **57.89** | **62.16** | **71.69** |

- - **BEIR**:

- | BEIR | e5-mistral-7b-instruct | SFR-Embedding-Mistral | NV-Embed-v1 | Linq-Embed-Mistral | SFR-Embedding-2_R | gte-Qwen2-7B-instruct | stella_en_1.5B_v5 | bge-multilingual-gemma2 | bge-en-icl zero-shot | bge-en-icl few-shot |
- | :----------------: | :--------------------: | :-------------------: | :---------: | :----------------: | :---------------: | :-------------------: | :----------------: | :---------------------: | :----------------------: | :---------------------: |
- | **ArguAna** | 61.9 | 67.27 | 68.21 | 69.65 | 62.34 | 64.27 | 65.27 | 77.37 | 82.76 | **83.08** |
- | **ClimateFEVER** | 38.4 | 36.41 | 34.72 | 39.11 | 34.43 | **45.88** | 46.11 | 39.37 | 45.35 | 45.43 |
- | **CQA** | 43 | 46.54 | **50.51** | 47.27 | 46.11 | 46.43 | 47.75 | 47.94 | 47.23 | 47.31 |
- | **DBPedia** | 48.9 | 49.06 | 48.29 | 51.32 | 51.21 | **52.42** | 52.28 | 51.37 | 50.42 | 51.63 |
- | **FEVER** | 87.8 | 89.35 | 87.77 | 92.42 | 92.16 | **95.11** | 94.83 | 90.38 | 91.96 | 92.83 |
- | **FiQA2018** | 56.6 | 60.55 | **63.1** | 61.2 | 61.77 | 62.03 | 60.48 | 60.04 | 58.77 | 59.67 |
- | **HotpotQA** | 75.7 | 77.02 | 79.92 | 76.24 | 81.36 | 73.08 | 76.67 | 83.26 | 84.98 | **85.14** |
- | **MSMARCO** | 43.1 | 43.41 | 46.49 | 45.21 | 42.18 | 45.98 | 45.22 | 45.71 | 46.72 | **46.79** |
- | **NFCorpus** | 38.6 | 42.02 | 38.04 | 41.62 | 41.34 | 40.6 | **42** | 38.11 | 40.69 | 41.85 |
- | **NQ** | 63.5 | 69.92 | 71.22 | 70.63 | 73.96 | 67 | 71.8 | 71.45 | 73.85 | **73.88** |
- | **QuoraRetrieval** | 89.6 | 89.81 | 89.21 | 90.27 | 89.58 | 90.09 | 90.03 | 90.04 | 91.02 | **90.95** |
- | **SCIDOCS** | 16.3 | 19.91 | 20.19 | 21.93 | 24.87 | **28.91** | 26.64 | 26.93 | 25.25 | 25.26 |
- | **SciFact** | 76.4 | 78.06 | 78.43 | 78.32 | **85.91** | 79.06 | 80.09 | 72.05 | 78.33 | 79.09 |
- | **Touche2020** | 26.4 | 29 | 28.38 | **30.61** | 28.18 | 30.57 | 29.94 | 30.26 | 29.67 | 30.48 |
- | **TRECCOVID** | 87.2 | 87.1 | 85.88 | 87.1 | **87.28** | 82.26 | 85.98 | 64.27 | 78.11 | 79.08 |
- | **Mean** | 56.89 | 59.03 | 59.36 | 60.19 | 60.18 | 60.25 | 61.21 | 59.24 | 61.67 | **62.16** |
+ - **MTEB**:

+ ![BEIR](.\results\BEIR.png)

+ - **BEIR**:

+ ![BEIR](.\results\BEIR.png)

 - **AIR-Bench**:
 