MERA-evaluation committed
Commit 3c1d352 · verified · 1 parent: 9514cc6

Update README.md

Files changed (1): README.md (+9 -1)
README.md CHANGED

@@ -41,6 +41,14 @@ configs:
     path: data/shots-*
   - split: test
     path: data/test-*
+license: cc-by-4.0
+task_categories:
+- visual-question-answering
+language:
+- ru
+pretty_name: ruCLEVR
+size_categories:
+- 1K<n<10K
 ---


@@ -136,4 +144,4 @@ To create RuCLEVR, we used two strategies: 1) generation of the new samples and

 Metrics for aggregated evaluation of responses:

-- `Exact match`: Exact match is the average of scores for all processed cases, where a given case score is 1 if the predicted string is the exact same as its reference string, and is 0 otherwise.
+- `Exact match`: Exact match is the average of scores for all processed cases, where a given case score is 1 if the predicted string is the exact same as its reference string, and is 0 otherwise.
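The `Exact match` metric described above can be sketched as follows; this is a minimal illustrative helper, not the actual MERA evaluation code, and the function name is hypothetical:

```python
def exact_match(predictions, references):
    """Average of per-case scores: a case scores 1 if the predicted
    string is exactly the same as its reference string, else 0."""
    scores = [1 if pred == ref else 0 for pred, ref in zip(predictions, references)]
    return sum(scores) / len(scores)

# Example: two of four predictions match their references exactly -> 0.5
print(exact_match(["red", "3", "cube", "yes"], ["red", "2", "sphere", "yes"]))
```

Note that the comparison is strict string equality, so any difference in case, whitespace, or punctuation scores 0; real harnesses typically normalize predictions before comparing.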