Formats: json
Languages: English
Size: < 1K
Libraries: Datasets, pandas
mahirlabibdihan committed
Commit b963f64 · verified · 1 Parent(s): 8af6e1a

Update README.md

Files changed (1):
  README.md  +20 −22
README.md CHANGED
@@ -52,28 +52,26 @@ for item in ds["test"]:
 
  ## Leaderboard
 
- | Model | Overall | Place Info | Nearby | Routing | Trip | Unanswerable |
- |--------------------------|:---------:|:------------:|:--------:|:---------:|:--------:|:--------------:|
- | Claude-3.5-Sonnet | **66.33** | **73.44** | 73.49 | **75.76** | **49.25** | 40.00 |
- | Gemini-1.5-Pro | **66.33** | 65.63 | **74.70** | 69.70 | 47.76 | **85.00** |
- | GPT-4o | 63.33 | 64.06 | **74.70** | 69.70 | **49.25** | 40.00 |
- | GPT-4-Turbo | 62.33 | 67.19 | 71.08 | 71.21 | 47.76 | 30.00 |
- | Gemini-1.5-Flash | 58.67 | 62.50 | 67.47 | 66.67 | 38.81 | 50.00 |
- | GPT-4o-mini | 51.00 | 46.88 | 63.86 | 57.58 | 40.30 | 25.00 |
- | GPT-3.5-Turbo | 37.67 | 26.56 | 53.01 | 48.48 | 28.36 | 5.00 |
- | Llama-3.1-70B | 61.00 | 70.31 | 67.47 | 69.70 | 40.30 | 45.00 |
- | Llama-3.2-90B | 58.33 | 68.75 | 66.27 | 66.67 | 38.81 | 30.00 |
- | Qwen2.5-72B | 57.00 | 62.50 | 71.08 | 63.64 | 41.79 | 10.00 |
- | Qwen2.5-14B | 53.67 | 57.81 | 71.08 | 59.09 | 32.84 | 20.00 |
- | Gemma-2.0-27B | 49.00 | 39.06 | 71.08 | 59.09 | 31.34 | 15.00 |
- | Gemma-2.0-9B | 47.33 | 50.00 | 50.60 | 59.09 | 34.33 | 30.00 |
- | Llama-3.1-8B | 44.00 | 53.13 | 57.83 | 45.45 | 23.88 | 20.00 |
- | Qwen2.5-7B | 43.33 | 48.44 | 49.40 | 42.42 | 38.81 | 20.00 |
- | Mistral-Nemo | 43.33 | 46.88 | 50.60 | 50.00 | 32.84 | 15.00 |
- | Mixtral-8x7B | 43.00 | 53.13 | 54.22 | 45.45 | 26.87 | 10.00 |
- | Phi-3.5-mini | 37.00 | 40.63 | 48.19 | 46.97 | 20.90 | 0.00 |
- | Llama-3.2-3B | 33.00 | 31.25 | 49.40 | 31.82 | 25.37 | 0.00 |
- | Human | 86.67 | 92.19 | 90.36 | 81.81 | 88.06 | 65.00 |
+ | Model | Overall | Place Info | Nearby | Routing | Trip | Unanswerable |
+ |---------------------|:---------:|:------------:|:--------:|:---------:|:--------:|:--------------:|
+ | Claude-3.5-Sonnet | **64.00** | **68.75** | **55.42** | **65.15** | **71.64** | 55.00 |
+ | GPT-4-Turbo | 53.67 | 62.50 | 50.60 | 60.61 | 50.75 | 25.00 |
+ | GPT-4o | 48.67 | 59.38 | 40.96 | 50.00 | 56.72 | 15.00 |
+ | Gemini-1.5-Pro | 43.33 | 65.63 | 30.12 | 40.91 | 34.33 | **65.00** |
+ | Gemini-1.5-Flash | 41.67 | 51.56 | 38.55 | 46.97 | 34.33 | 30.00 |
+ | GPT-3.5-Turbo | 27.33 | 39.06 | 22.89 | 33.33 | 19.40 | 15.00 |
+ | GPT-4o-mini | 23.00 | 28.13 | 14.46 | 13.64 | 43.28 | 5.00 |
+ | Llama-3.2-90B | 39.67 | 54.69 | 37.35 | 39.39 | 35.82 | 15.00 |
+ | Llama-3.1-70B | 37.67 | 53.13 | 32.53 | 42.42 | 31.34 | 15.00 |
+ | Mixtral-8x7B | 27.67 | 32.81 | 18.07 | 27.27 | 38.81 | 15.00 |
+ | Gemma-2.0-9B | 27.00 | 35.94 | 14.46 | 28.79 | 26.87 | 45.00 |
+
+ #### Comparison between ReAct and Chameleon with GPT-3.5-Turbo
+
+ | Model | Overall | Place Info | Nearby | Routing | Trip | Unanswerable |
+ |----------------------------|:-------:|:----------:|:------:|:-------:|:------:|:------------:|
+ | ReAct | 27.33 | 39.06 | 22.89 | 33.33 | 19.40 | 15.00 |
+ | Chameleon | 49.33 | 54.69 | 54.21 | 51.51 | 43.28 | 25.00 |
 
  ## Citation
 
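For context, the hunk header shows this change sits just below the README's data-loading snippet (`for item in ds["test"]:`). A minimal sketch of that loading pattern with the `datasets` library follows; the repo id is a placeholder assumption, not taken from this commit, so substitute the actual dataset id shown on the repository page.

```python
from datasets import load_dataset

# Placeholder repo id -- an assumption for illustration only; replace it
# with the real dataset id from this repository's main page.
REPO_ID = "mahirlabibdihan/your-dataset-name"

# Load all splits of the JSON-formatted dataset from the Hugging Face Hub.
ds = load_dataset(REPO_ID)

# Iterate over the test split, as in the README snippet the hunk context comes from.
for item in ds["test"]:
    # Each item is a dict whose keys follow the dataset's JSON schema.
    print(item)
```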