unknown committed on
Commit · 2d001b6
Parent(s): b12f80d
add files

README.md CHANGED
@@ -101,7 +101,7 @@ This repository contains all fine-tuned models for experiments with ComBack.
 | Model | EM | ED | EM | ED | EM | ED | EM | ED | EM | ED | EM | ED | BLEU4 | ED | BLEU4 | ED | BLEU4 | ED |
 | ChatGPT-3.5-Turbo | 10.34 | 38.41 | 15.35 | 42.94 | 12.01 | 41.47 | 6.44 | 12.9 | 9.75 | 20.79 | 7.97 | 17.79 | 7.33 | 30.83 | 7.35 | 32.34 | 8.12 | 32.71 |
 | Code-LLaMA-34B | 0.41 | 19.07 | 0.85 | 16.77 | 0.56 | 18.22 | 1.58 | 13.54 | 2.66 | 17.95 | 2.47 | 16.59 | 9.38 | 35.53 | 11.06 | 37.15 | 8.24 | 33.00 |
-| CodeT5+-220m | **51.16** | **75.32** | **52.45** | **74.57** | **50.56** | **75.52** | **49.11** | **67.84** | **38.26** | **59.21** | **38.33** | **56.31** |
+| CodeT5+-220m | **51.16** | **75.32** | **52.45** | **74.57** | **50.56** | **75.52** | **49.11** | **67.84** | **38.26** | **59.21** | **38.33** | **56.31** | **32.56** | **58.67** | **19.94** | **50.27** | **25.47** | **52.60** |
 
 49.11 67.84 38.26 59.21 38.33 56.3
 
@@ -113,7 +113,7 @@ This repository contains all fine-tuned models for experiments with ComBack.
 | Model | EM | ED | EM | ED | EM | ED | EM | ED | EM | ED | EM | ED | BLEU4 | ED | BLEU4 | ED | BLEU4 | ED |
 | ChatGPT-3.5-Turbo | 12.08 | 41.39 | 16.77 | 42.02 | 14.73 | 43.72 | 9.80 | 21.86 | 10.81 | 20.66 | 11.39 | 22.82 | 9.24 | 32.13 | 11.96 | 35.33 | 10.07 | 32.90 |
 | Code-LLaMA-34B | 0.45 | 17.61 | 0.61 | 17.21 | 0.99 | 17.23 | 1.75 | 15.04 | 0.42 | 11.27 | 2.42 | 16.25 | 6.92 | 32.54 | 8.95 | 38.22 | 8.20 | 34.16 |
-| CodeT5+-220m | **62.68** | **82.02** | **71.34** | **85.98** | **64.45** | **81.53** | **48.71** | **68.95** | **58.68** | **74.57** | **47.81** | **65.5** |
+| CodeT5+-220m | **62.68** | **82.02** | **71.34** | **85.98** | **64.45** | **81.53** | **48.71** | **68.95** | **58.68** | **74.57** | **47.81** | **65.5** | **50.34** | **72.98** | **55.38** | **74.41** | **66.36** | **44.33** |
 
 
 