sha (stringlengths 40..40) | text (stringlengths 0..13.4M) | id (stringlengths 2..117) | tags (sequence) | created_at (stringlengths 25..25) | metadata (stringlengths 2..31.7M) | last_modified (stringlengths 25..25) |
---|---|---|---|---|---|---|
9fac5742b557246d7e57e19e9164e4ee3270fc9e | | Recag/Rp_CommonC_678_2 | ["region:us"] | 2024-02-16T13:49:21+00:00 | {} | 2024-02-16T13:56:44+00:00 |
ce21ec9db446da096433384168fc72e9146755c1 | | Recag/Rp_CommonC_679_1 | ["region:us"] | 2024-02-16T13:50:14+00:00 | {} | 2024-02-16T14:07:18+00:00 |
1e5c9468c1629ecf109ece80e2738cc7d293a0e8 | | Recag/Rp_CommonC_679_2 | ["region:us"] | 2024-02-16T13:50:20+00:00 | {} | 2024-02-16T13:59:57+00:00 |
7f1b914158971430c66b31c6347507d226653cdb | | bastistrauss/DE_Plain | ["language:de", "language:en", "license:apache-2.0", "region:us"] | 2024-02-16T13:51:06+00:00 | {"language": ["de", "en"], "license": "apache-2.0"} | 2024-02-16T13:52:54+00:00 |
a01a660bcdff4a356a8a5fc7acdbd7df261f56e4 | | Recag/Rp_CommonC_680_1 | ["region:us"] | 2024-02-16T13:52:51+00:00 | {} | 2024-02-16T14:08:34+00:00 |
0a6429be4b3d45c3d4ae4350f955082a03800505 | | Recag/Rp_CommonC_680_2 | ["region:us"] | 2024-02-16T13:52:59+00:00 | {} | 2024-02-16T14:00:53+00:00 |
55322e4e72671e0cb69e8e96ef4332858b61cdcd | | meabhishekkumar/dataset_NTD3 | ["region:us"] | 2024-02-16T13:57:20+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "caption", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 690120.0, "num_examples": 2}], "download_size": 690319, "dataset_size": 690120.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T14:26:50+00:00 |
3d22e6c89f1e1a643e0276cec4733f44f2b83ebb | | Recag/Rp_CommonC_681_1 | ["region:us"] | 2024-02-16T13:59:16+00:00 | {} | 2024-02-16T14:20:21+00:00 |
c470426f188edd5bd515c2fd781d1f338259c726 | | Recag/Rp_CommonC_681_2 | ["region:us"] | 2024-02-16T13:59:28+00:00 | {} | 2024-02-16T14:15:26+00:00 |
0b9e7d3d254a2ef83716862f6deb2d48c048fc2e |
## Dataset Description
Your dataset description goes here. Describe what the dataset contains, its purpose, and any relevant information about its creation and usage.
## Data Format
Each entry in the dataset consists of two fields:
- `audio_path`: The path to the audio file.
- `description`: The written description corresponding to the audio file.
## Example
Here's an example entry from the dataset:
```json
{
  "audio_path": "/path/to/audio/file.mp3",
  "description": "A written description of the audio content."
}
```
| lucasjca/ProcedimentosSUS | ["license:apache-2.0", "region:us"] | 2024-02-16T14:00:13+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}], "splits": [{"name": "train", "num_bytes": 41920700.0, "num_examples": 89}], "download_size": 41553598, "dataset_size": 41920700.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T18:25:29+00:00 |
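For the `audio_path`/`description` entry format described in the lucasjca/ProcedimentosSUS card above, a minimal reading sketch (the `entries.jsonl` file name is hypothetical, and this is not the dataset's own loader):

```python
import json

# Minimal sketch: iterate over entries shaped like the example in the card above.
# "entries.jsonl" is a hypothetical file with one JSON object per line.
with open("entries.jsonl", encoding="utf-8") as f:
    for line in f:
        entry = json.loads(line)
        print(entry["audio_path"], "->", entry["description"])
```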
3c243f0767855b0bdc30612c92f3e436aba2aefd | | Recag/Rp_CommonC_682_1 | ["region:us"] | 2024-02-16T14:00:57+00:00 | {} | 2024-02-16T14:23:21+00:00 |
925607d552e618761ab665d325d1957dec9cd15b | | Recag/Rp_CommonC_682_2 | ["region:us"] | 2024-02-16T14:01:03+00:00 | {} | 2024-02-16T14:22:55+00:00 |
e434e06588ae64727cfeeee43b21468f15d09980 | | sajib09/badd_dataset | ["region:us"] | 2024-02-16T14:06:12+00:00 | {} | 2024-02-16T14:06:12+00:00 |
763c02ee5913c20a387f15d6f58cf344f184d4f2 | | marianna13/openhermes-2.5-webdataset | ["region:us"] | 2024-02-16T14:06:29+00:00 | {} | 2024-02-16T14:52:00+00:00 |
c5e8db524c6eecdc4dfbe1d406925a5134ae387f | | ygtozclk97/mini-platypus-two | ["region:us"] | 2024-02-16T14:06:42+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4186564, "num_examples": 1000}], "download_size": 2245921, "dataset_size": 4186564}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T14:06:44+00:00 |
5c87817e72776c426053ef9c0609553126fdbe2b | | romain22222/pokemon-captions | ["region:us"] | 2024-02-16T14:09:40+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "caption", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 166725933.125, "num_examples": 1271}], "download_size": 163282284, "dataset_size": 166725933.125}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T14:52:00+00:00 |
c2dabf9a8a0e0f303e1f448162bc5dd701fcb00b | | HuggingFaceTB/stories_openhermes_merged_decont_report | ["region:us"] | 2024-02-16T14:09:49+00:00 | {"dataset_info": {"features": [{"name": "completion", "dtype": "string"}, {"name": "dedup_ngram", "sequence": "string"}, {"name": "dedup_bench_name", "dtype": "string"}, {"name": "dedup_bench_text", "dtype": "string"}, {"name": "dedup_diff", "sequence": "string"}, {"name": "dedup_diff_ratio", "dtype": "float64"}, {"name": "dedup_diff_length", "dtype": "int64"}, {"name": "dedup_longest_diff_part", "dtype": "string"}, {"name": "dedup_longest_diff_part_length", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2470852, "num_examples": 706}], "download_size": 1329398, "dataset_size": 2470852}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T14:09:49+00:00 |
36789386a374c15c303b893fb0cdbf515ab13eec |
# Dataset Card for Evaluation run of aloobun/Reyna-Mini-1.8B-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aloobun/Reyna-Mini-1.8B-v0.2](https://huggingface.co/aloobun/Reyna-Mini-1.8B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.2",
"harness_winogrande_5",
split="train")
```
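The aggregated "results" configuration mentioned above can be loaded the same way; a minimal sketch, assuming it exposes the same "train" split as the per-task configurations:

```python
from datasets import load_dataset

# Minimal sketch (assumption): the "results" configuration and its "train" split
# follow the description above; adjust the names if the repository layout differs.
results = load_dataset(
    "open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.2",
    "results",
    split="train",
)
print(results[0])
```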
## Latest results
These are the [latest results from run 2024-02-16T14:08:02.040553](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.2/blob/main/results_2024-02-16T14-08-02.040553.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.44626133629619946,
"acc_stderr": 0.034425038440545834,
"acc_norm": 0.4487075344673435,
"acc_norm_stderr": 0.03514635225565978,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476196,
"mc2": 0.4123912546191622,
"mc2_stderr": 0.013946155781929979
},
"harness|arc:challenge|25": {
"acc": 0.34044368600682595,
"acc_stderr": 0.01384746051889298,
"acc_norm": 0.3660409556313993,
"acc_norm_stderr": 0.014077223108470144
},
"harness|hellaswag|10": {
"acc": 0.44692292372037445,
"acc_stderr": 0.004961587574275628,
"acc_norm": 0.6018721370244972,
"acc_norm_stderr": 0.00488511646555027
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.037786210790920545,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.037786210790920545
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.032436186361081004,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.032436186361081004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028417,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818115,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818115
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47096774193548385,
"acc_stderr": 0.028396016402761,
"acc_norm": 0.47096774193548385,
"acc_norm_stderr": 0.028396016402761
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.033554009049695646,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.033554009049695646
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.03804913653971012,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.03804913653971012
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.035402943770953675,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.035402943770953675
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.538860103626943,
"acc_stderr": 0.035975244117345775,
"acc_norm": 0.538860103626943,
"acc_norm_stderr": 0.035975244117345775
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008937,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008937
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5596330275229358,
"acc_stderr": 0.02128431062376155,
"acc_norm": 0.5596330275229358,
"acc_norm_stderr": 0.02128431062376155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.030546745264953178,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.030546745264953178
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.03476099060501636,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.03476099060501636
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.031900803894732356,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.031900803894732356
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5112107623318386,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.5112107623318386,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4110429447852761,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.4110429447852761,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748845,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5900383141762452,
"acc_stderr": 0.017587672312336048,
"acc_norm": 0.5900383141762452,
"acc_norm_stderr": 0.017587672312336048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.02687408588351835,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.02687408588351835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.01442229220480884,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.01442229220480884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.02843109544417664,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.02843109544417664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.43086816720257237,
"acc_stderr": 0.028125340983972714,
"acc_norm": 0.43086816720257237,
"acc_norm_stderr": 0.028125340983972714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.027777777777777804,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.027777777777777804
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028121636040639882,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028121636040639882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3468057366362451,
"acc_stderr": 0.012156071332318713,
"acc_norm": 0.3468057366362451,
"acc_norm_stderr": 0.012156071332318713
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42483660130718953,
"acc_stderr": 0.019997973035458336,
"acc_norm": 0.42483660130718953,
"acc_norm_stderr": 0.019997973035458336
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42448979591836733,
"acc_stderr": 0.031642094879429414,
"acc_norm": 0.42448979591836733,
"acc_norm_stderr": 0.031642094879429414
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5970149253731343,
"acc_stderr": 0.034683432951111266,
"acc_norm": 0.5970149253731343,
"acc_norm_stderr": 0.034683432951111266
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.038057975055904594,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.038057975055904594
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476196,
"mc2": 0.4123912546191622,
"mc2_stderr": 0.013946155781929979
},
"harness|winogrande|5": {
"acc": 0.6156274664561957,
"acc_stderr": 0.013671567600836196
},
"harness|gsm8k|5": {
"acc": 0.31311599696739956,
"acc_stderr": 0.01277428566938509
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.2 | [
"region:us"
] | 2024-02-16T14:10:07+00:00 | {"pretty_name": "Evaluation run of aloobun/Reyna-Mini-1.8B-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [aloobun/Reyna-Mini-1.8B-v0.2](https://huggingface.co/aloobun/Reyna-Mini-1.8B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T14:08:02.040553](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.2/blob/main/results_2024-02-16T14-08-02.040553.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44626133629619946,\n \"acc_stderr\": 0.034425038440545834,\n \"acc_norm\": 0.4487075344673435,\n \"acc_norm_stderr\": 0.03514635225565978,\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.015321821688476196,\n \"mc2\": 0.4123912546191622,\n \"mc2_stderr\": 0.013946155781929979\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.34044368600682595,\n \"acc_stderr\": 0.01384746051889298,\n \"acc_norm\": 0.3660409556313993,\n \"acc_norm_stderr\": 0.014077223108470144\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44692292372037445,\n \"acc_stderr\": 0.004961587574275628,\n \"acc_norm\": 0.6018721370244972,\n \"acc_norm_stderr\": 0.00488511646555027\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.04033565667848319,\n \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.04033565667848319\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205608,\n \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n 
\"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.037786210790920545,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.037786210790920545\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.032436186361081004,\n \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.032436186361081004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028417,\n \"acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03718489006818115,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03718489006818115\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.47096774193548385,\n \"acc_stderr\": 0.028396016402761,\n \"acc_norm\": 0.47096774193548385,\n \"acc_norm_stderr\": 0.028396016402761\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.033554009049695646,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.033554009049695646\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.03804913653971012,\n \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.03804913653971012\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.035402943770953675,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.035402943770953675\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.538860103626943,\n \"acc_stderr\": 0.035975244117345775,\n \"acc_norm\": 0.538860103626943,\n \"acc_norm_stderr\": 0.035975244117345775\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.03196876989195778,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.03196876989195778\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5596330275229358,\n \"acc_stderr\": 0.02128431062376155,\n \"acc_norm\": 0.5596330275229358,\n \"acc_norm_stderr\": 0.02128431062376155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.030546745264953178,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.030546745264953178\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.03476099060501636,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.03476099060501636\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5991561181434599,\n \"acc_stderr\": 0.031900803894732356,\n \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.031900803894732356\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4110429447852761,\n \"acc_stderr\": 0.038656978537853624,\n \"acc_norm\": 0.4110429447852761,\n \"acc_norm_stderr\": 0.038656978537853624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n \"acc_stderr\": 0.029058588303748845,\n \"acc_norm\": 0.7307692307692307,\n \"acc_norm_stderr\": 0.029058588303748845\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5900383141762452,\n \"acc_stderr\": 
0.017587672312336048,\n \"acc_norm\": 0.5900383141762452,\n \"acc_norm_stderr\": 0.017587672312336048\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.02687408588351835,\n \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.02687408588351835\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.01442229220480884,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.01442229220480884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.02843109544417664,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.02843109544417664\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.43086816720257237,\n \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.43086816720257237,\n \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.027777777777777804,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.027777777777777804\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028121636040639882,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028121636040639882\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3468057366362451,\n \"acc_stderr\": 0.012156071332318713,\n \"acc_norm\": 0.3468057366362451,\n \"acc_norm_stderr\": 0.012156071332318713\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.02850145286039655,\n \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.02850145286039655\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.42483660130718953,\n \"acc_stderr\": 0.019997973035458336,\n \"acc_norm\": 0.42483660130718953,\n \"acc_norm_stderr\": 0.019997973035458336\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.42448979591836733,\n \"acc_stderr\": 0.031642094879429414,\n \"acc_norm\": 0.42448979591836733,\n \"acc_norm_stderr\": 0.031642094879429414\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5970149253731343,\n \"acc_stderr\": 0.034683432951111266,\n \"acc_norm\": 0.5970149253731343,\n \"acc_norm_stderr\": 0.034683432951111266\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.038057975055904594,\n \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.038057975055904594\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.015321821688476196,\n \"mc2\": 0.4123912546191622,\n \"mc2_stderr\": 0.013946155781929979\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6156274664561957,\n \"acc_stderr\": 0.013671567600836196\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.31311599696739956,\n \"acc_stderr\": 0.01277428566938509\n }\n}\n```", "repo_url": 
"https://huggingface.co/aloobun/Reyna-Mini-1.8B-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-08-02.040553.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-08-02.040553.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-08-02.040553.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-08-02.040553.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-08-02.040553.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-08-02.040553.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["**/details_harness|winogrande|5_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T14-08-02.040553.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T14_08_02.040553", "path": ["results_2024-02-16T14-08-02.040553.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T14-08-02.040553.parquet"]}]}]} | 2024-02-16T14:10:32+00:00 |
0c321465e6cb4b24dde374e72228100b0fe012b3 | # Dataset Card for "reformat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Drewskidang/reformat | [
"region:us"
] | 2024-02-16T14:10:52+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5719386, "num_examples": 21257}], "download_size": 2490392, "dataset_size": 5719386}} | 2024-02-16T14:10:54+00:00 |
467cf4bd133e9b5d4a6a87227b42267811001e7e |
# Dataset Card for Evaluation run of NovoCode/Tiger-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NovoCode/Tiger-DPO](https://huggingface.co/NovoCode/Tiger-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NovoCode__Tiger-DPO",
"harness_winogrande_5",
	split="latest")
```
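To work with the aggregated metrics instead of per-task details, a minimal sketch along the same lines (assuming the "results" configuration exposes the same "latest" split layout as the per-task configurations listed in this dataset's configs) would be:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points to the
# most recent evaluation run recorded in this dataset.
results = load_dataset(
    "open-llm-leaderboard/details_NovoCode__Tiger-DPO",
    "results",
    split="latest",
)

# Each row holds the aggregated metrics for one run.
print(results[0])
```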
## Latest results
These are the [latest results from run 2024-02-16T14:49:32.301206](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Tiger-DPO/blob/main/results_2024-02-16T14-49-32.301206.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results for each eval in its "latest" split):
```python
{
"all": {
"acc": 0.5958020167795036,
"acc_stderr": 0.03336424247295358,
"acc_norm": 0.6002192767446106,
"acc_norm_stderr": 0.03404810463176771,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.507583773070451,
"mc2_stderr": 0.014770823381787772
},
"harness|arc:challenge|25": {
"acc": 0.43686006825938567,
"acc_stderr": 0.014494421584256525,
"acc_norm": 0.48208191126279865,
"acc_norm_stderr": 0.01460200558549098
},
"harness|hellaswag|10": {
"acc": 0.620991834295957,
"acc_stderr": 0.004841486716855774,
"acc_norm": 0.8181637124078869,
"acc_norm_stderr": 0.0038492126228151665
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601677,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601677
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724352,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624335,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624335
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847837,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847837
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073318,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073318
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.01471168438613996,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.01471168438613996
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.02519018132760842,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.02519018132760842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38994413407821227,
"acc_stderr": 0.01631237662921307,
"acc_norm": 0.38994413407821227,
"acc_norm_stderr": 0.01631237662921307
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906497,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906497
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004913,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004913
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4132985658409387,
"acc_stderr": 0.012576779494860087,
"acc_norm": 0.4132985658409387,
"acc_norm_stderr": 0.012576779494860087
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061173,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061173
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.507583773070451,
"mc2_stderr": 0.014770823381787772
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207397
},
"harness|gsm8k|5": {
"acc": 0.41015921152388174,
"acc_stderr": 0.013548335117860353
}
}
```
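As a rough illustration of how these numbers can be post-processed, the sketch below assumes the JSON above has been saved locally under the same filename as the linked results file (the path is otherwise hypothetical; any local copy works). It prints the aggregated "all" block and a simple unweighted average over the MMLU (hendrycksTest) subtasks:

```python
import json

# Hypothetical local copy of the linked results file; adjust the path as needed.
with open("results_2024-02-16T14-49-32.301206.json") as f:
    data = json.load(f)

# The full results file may nest the per-task metrics under a "results" key;
# the block shown above has them at the top level, so handle both layouts.
metrics = data.get("results", data)

print(metrics["all"])  # aggregated accuracies and TruthfulQA scores

mmlu_accs = [
    task["acc"]
    for name, task in metrics.items()
    if name.startswith("harness|hendrycksTest")
]
print(sum(mmlu_accs) / len(mmlu_accs))  # unweighted mean over the MMLU subtasks
```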
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NovoCode__Tiger-DPO | [
"region:us"
] | 2024-02-16T14:11:00+00:00 | {"pretty_name": "Evaluation run of NovoCode/Tiger-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [NovoCode/Tiger-DPO](https://huggingface.co/NovoCode/Tiger-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NovoCode__Tiger-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T14:49:32.301206](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Tiger-DPO/blob/main/results_2024-02-16T14-49-32.301206.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5958020167795036,\n \"acc_stderr\": 0.03336424247295358,\n \"acc_norm\": 0.6002192767446106,\n \"acc_norm_stderr\": 0.03404810463176771,\n \"mc1\": 0.32313341493268055,\n \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.507583773070451,\n \"mc2_stderr\": 0.014770823381787772\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.43686006825938567,\n \"acc_stderr\": 0.014494421584256525,\n \"acc_norm\": 0.48208191126279865,\n \"acc_norm_stderr\": 0.01460200558549098\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.620991834295957,\n \"acc_stderr\": 0.004841486716855774,\n \"acc_norm\": 0.8181637124078869,\n \"acc_norm_stderr\": 0.0038492126228151665\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n 
\"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601677,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601677\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n \"acc_stderr\": 0.026860206444724352,\n \"acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.026860206444724352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624335,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624335\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5846153846153846,\n \"acc_stderr\": 0.02498535492310234,\n \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847837,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847837\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073318,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073318\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n \"acc_stderr\": 0.01471168438613996,\n 
\"acc_norm\": 0.7841634738186463,\n \"acc_norm_stderr\": 0.01471168438613996\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.02519018132760842,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.02519018132760842\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38994413407821227,\n \"acc_stderr\": 0.01631237662921307,\n \"acc_norm\": 0.38994413407821227,\n \"acc_norm_stderr\": 0.01631237662921307\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906497,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906497\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n \"acc_stderr\": 0.026730620728004913,\n \"acc_norm\": 0.6688102893890675,\n \"acc_norm_stderr\": 0.026730620728004913\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4132985658409387,\n \"acc_stderr\": 0.012576779494860087,\n \"acc_norm\": 0.4132985658409387,\n \"acc_norm_stderr\": 0.012576779494860087\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061173,\n \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061173\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.507583773070451,\n \"mc2_stderr\": 0.014770823381787772\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207397\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41015921152388174,\n \"acc_stderr\": 0.013548335117860353\n }\n}\n```", "repo_url": 
"https://huggingface.co/NovoCode/Tiger-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-08-40.800066.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-08-40.800066.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-49-32.301206.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-49-32.301206.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-49-32.301206.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-49-32.301206.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-08-40.800066.parquet"]}, 
{"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["**/details_harness|winogrande|5_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": ["**/details_harness|winogrande|5_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T14-49-32.301206.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T14_08_40.800066", "path": ["results_2024-02-16T14-08-40.800066.parquet"]}, {"split": "2024_02_16T14_49_32.301206", "path": 
["results_2024-02-16T14-49-32.301206.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T14-49-32.301206.parquet"]}]}]} | 2024-02-16T14:51:54+00:00 |
b474f28e13621358e854b2d6a4340bc3fbb9b98a | El-chapoo/1000k_AO1 | [
"region:us"
] | 2024-02-16T14:14:12+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8744559, "num_examples": 1}], "download_size": 8079805, "dataset_size": 8744559}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T14:14:17+00:00 |
|
d2fb0045bf6f4b4d160d95f53626d05905a8878e |
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-openhermes-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-openhermes-sft](https://huggingface.co/CorticalStack/mistral-7b-openhermes-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft",
"harness_winogrande_5",
split="train")
```
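
As a minimal sketch (not part of the original card), the aggregated "results" configuration described above can be loaded the same way; the config name `results` and the `latest` split are taken from this card's own configuration list:

```python
from datasets import load_dataset

# Load the aggregated results configuration; "latest" always points to the
# most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics shown below
```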
## Latest results
These are the [latest results from run 2024-02-16T14:13:04.061725](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft/blob/main/results_2024-02-16T14-13-04.061725.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6070598204374664,
"acc_stderr": 0.03297690039129263,
"acc_norm": 0.6130046390646828,
"acc_norm_stderr": 0.033660154914381686,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179185,
"mc2": 0.4630793817398098,
"mc2_stderr": 0.014741207245405565
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.01443413871337998,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.01428052266746732
},
"harness|hellaswag|10": {
"acc": 0.6233817964548894,
"acc_stderr": 0.004835475957610925,
"acc_norm": 0.8200557657837084,
"acc_norm_stderr": 0.003833559228158668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.02582210611941589,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.02582210611941589
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110943,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110943
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.017818849564796634,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.017818849564796634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.01424887354921758,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.01424887354921758
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010073,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567654,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567654
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681393,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681393
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179185,
"mc2": 0.4630793817398098,
"mc2_stderr": 0.014741207245405565
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774104
},
"harness|gsm8k|5": {
"acc": 0.312357846853677,
"acc_stderr": 0.012765850404191427
}
}
```
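
As a hedged illustration (not part of the original card), the raw results file linked above can also be fetched directly from the dataset repository with `huggingface_hub`; the filename is taken from the link in this section, and the exact nesting of the JSON may differ from the snippet shown (current leaderboard dumps place the metrics under a top-level `"results"` key):

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft",
    filename="results_2024-02-16T14-13-04.061725.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Fall back gracefully depending on whether the metrics are nested.
metrics = data.get("results", data)
print(metrics["all"]["acc"])  # overall accuracy, matching the block above
```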
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft | [
"region:us"
] | 2024-02-16T14:15:25+00:00 | {"pretty_name": "Evaluation run of CorticalStack/mistral-7b-openhermes-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-openhermes-sft](https://huggingface.co/CorticalStack/mistral-7b-openhermes-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T14:13:04.061725](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft/blob/main/results_2024-02-16T14-13-04.061725.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6070598204374664,\n \"acc_stderr\": 0.03297690039129263,\n \"acc_norm\": 0.6130046390646828,\n \"acc_norm_stderr\": 0.033660154914381686,\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.016255241993179185,\n \"mc2\": 0.4630793817398098,\n \"mc2_stderr\": 0.014741207245405565\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.01443413871337998,\n \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.01428052266746732\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6233817964548894,\n \"acc_stderr\": 0.004835475957610925,\n \"acc_norm\": 0.8200557657837084,\n \"acc_norm_stderr\": 0.003833559228158668\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n \"acc_stderr\": 0.02582210611941589,\n \"acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.02582210611941589\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959215,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959215\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 
0.026148483469153303\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110943,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110943\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478466,\n \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478466\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7779816513761468,\n \"acc_stderr\": 0.017818849564796634,\n \"acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.017818849564796634\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145624,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145624\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n \"acc_stderr\": 0.01424887354921758,\n \"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.01424887354921758\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n \"acc_stderr\": 0.014854993938010073,\n \"acc_norm\": 0.27039106145251396,\n \"acc_norm_stderr\": 0.014854993938010073\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n \"acc_stderr\": 0.012671902782567654,\n \"acc_norm\": 0.4380704041720991,\n \"acc_norm_stderr\": 0.012671902782567654\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681393,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681393\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.016255241993179185,\n \"mc2\": 0.4630793817398098,\n \"mc2_stderr\": 0.014741207245405565\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774104\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.312357846853677,\n \"acc_stderr\": 0.012765850404191427\n }\n}\n```", "repo_url": "https://huggingface.co/CorticalStack/mistral-7b-openhermes-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-13-04.061725.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-13-04.061725.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-13-04.061725.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-13-04.061725.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-13-04.061725.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T14_13_04.061725", "path": ["**/details_harness|winogrande|5_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T14-13-04.061725.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T14_13_04.061725", "path": ["results_2024-02-16T14-13-04.061725.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T14-13-04.061725.parquet"]}]}]} | 2024-02-16T14:15:48+00:00 |
020e8bab690a9e18ff685dd17606553d31554371 | anaili/title_BREAST_embdeddings | [
"license:apache-2.0",
"region:us"
] | 2024-02-16T14:15:26+00:00 | {"license": "apache-2.0"} | 2024-02-16T14:16:10+00:00 |
|
7eadbd8948d5b6ddda5948c349994bfa2294c1ef |
## The annotations are in the **annotations_coco.json** file | saberder/humpback | [
"task_categories:object-detection",
"task_categories:image-classification",
"size_categories:1K<n<10K",
"language:en",
"biology",
"region:us"
] | 2024-02-16T14:15:58+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["object-detection", "image-classification"], "pretty_name": "Humpback", "tags": ["biology"]} | 2024-02-16T16:09:40+00:00 |
93b675daa2bd4cc110a3c18ecf722bc2e7868c6a | # Task Names
- **FLAN-2021 -> 70 tasks**
```json
{
"ag_news_subset": null,
"ai2_arc_ARC-Challenge": null,
"ai2_arc_ARC-Easy": null,
"aeslc": null,
"anli_r1": null,
"anli_r2": null,
"anli_r3": null,
"bool_q": null,
"cnn_dailymail": null,
"coqa": null,
"cosmos_qa": null,
"definite_pronoun_resolution": null,
"drop": null,
"fix_punct": null,
"gem_common_gen": null,
"gem_dart": null,
"gem_e2e_nlg": null,
"gem_web_nlg_en": null,
"gem_wiki_lingua_english_en": null,
"gigaword": null,
"glue_cola": null,
"glue_mnli": null,
"glue_mrpc": null,
"glue_qnli": null,
"glue_qqp": null,
"glue_sst2": null,
"glue_stsb": null,
"glue_wnli": null,
"hellaswag": null,
"huggingface_xsum": null,
"imdb_reviews_plain_text": null,
"lambada": null,
"math_dataset_algebra__linear_1d": null,
"multi_news": null,
"natural_questions_open": null,
"newsroom": null,
"openbookqa": null,
"opinion_abstracts_idebate": null,
"opinion_abstracts_rotten_tomatoes": null,
"para_crawl_enes": null,
"paws_wiki": null,
"piqa": null,
"quac": null,
"samsum": null,
"sentiment140": null,
"snli": null,
"squad_v1_1": null,
"squad_v2_0": null,
"story_cloze_2016": null,
"super_glue_cb": null,
"super_glue_copa": null,
"super_glue_multirc": null,
"super_glue_record": null,
"super_glue_rte": null,
"super_glue_wic": null,
"super_glue_wsc_fixed": null,
"trec": null,
"trivia_qa_rc": null,
"true_case": null,
"unified_qa_science_inst": null,
"winogrande": null,
"word_segment": null,
"wmt14_translate_fr-en": null,
"wmt16_translate_cs-en": null,
"wmt16_translate_de-en": null,
"wmt16_translate_fi-en": null,
"wmt16_translate_ro-en": null,
"wmt16_translate_ru-en": null,
"wmt16_translate_tr-en": null,
"yelp_polarity_reviews": null
}
``` | aslawliet/flan_zeroshot | [
"task_categories:text-generation",
"task_categories:text-classification",
"task_categories:token-classification",
"task_categories:question-answering",
"task_categories:zero-shot-classification",
"task_categories:translation",
"task_categories:summarization",
"size_categories:10M<n<100M",
"language:en",
"license:cc-by-4.0",
"region:us"
] | 2024-02-16T14:17:08+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["10M<n<100M"], "task_categories": ["text-generation", "text-classification", "token-classification", "question-answering", "zero-shot-classification", "translation", "summarization"]} | 2024-02-16T15:39:32+00:00 |
208390616700865bf6cfc22b0af2f515dc496ccc |
# Dataset Card for OpenHermes-2.5-1k-longest
<!-- Provide a quick summary of the dataset. -->
OpenHermes-2.5-1k-longest is a dataset of 1,000 samples derived from [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5) using the [Long is More for Alignment](https://huggingface.co/papers/2402.04833) protocol. This protocol consists of selecting the 1,000 longest responses and provides a strong baseline to measure performance against. For example, fine-tuning [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on this dataset using similar hyperparameters to those given in the paper produces a chat model that achieves a score of ~7.0 on MT-Bench, which is comparable in performance to training over the full ~1 million examples.
We found that stratified sampling across the subsets of [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5) produced slightly better models than uniform sampling. The result is a dataset with the following proportions across each `source`:
| source | proportion |
|:----------------------|-------------:|
| glaive-code-assist | 0.362 |
| CamelAI | 0.155 |
| metamath | 0.112 |
| EvolInstruct_70k | 0.103 |
| cot_alpaca_gpt4 | 0.083 |
| airoboros2.2 | 0.07 |
| platypus | 0.044 |
| GPT-4 Comparison Data | 0.03 |
| UnnaturalInstructions | 0.017 |
| CogStackMed | 0.009 |
| LMSys Chatbot Arena | 0.006 |
| caseus_custom | 0.005 |
| lmsys1m | 0.003 |
| Econ_domain_expert | 0.001 |
See the [`create_dataset.py`](https://huggingface.co/datasets/HuggingFaceH4/OpenHermes-2.5-1k-longest/blob/main/create_dataset.py) script for details on how the dataset was constructed.
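As a rough illustration only (the authoritative procedure is in `create_dataset.py`), the selection step can be sketched as below. The sketch assumes the ShareGPT-style schema of the source dataset — a `conversations` list of `{"from", "value"}` turns plus a `source` column — and the `"gpt"` role name and per-source quota rounding are assumptions of this sketch, not a description of the released pipeline.

```python
from collections import Counter

from datasets import concatenate_datasets, load_dataset

N_TOTAL = 1_000

# Source corpus; the published selection logic lives in create_dataset.py.
ds = load_dataset("teknium/OpenHermes-2.5", split="train")

def assistant_chars(example):
    # Total character count of the assistant ("gpt") turns in one conversation.
    return sum(len(t["value"]) for t in example["conversations"] if t["from"] == "gpt")

ds = ds.map(lambda ex: {"response_length": assistant_chars(ex)})

# Allocate the 1,000 slots across sources in proportion to their share of the corpus.
source_counts = Counter(ds["source"])
quotas = {s: max(1, round(N_TOTAL * c / len(ds))) for s, c in source_counts.items()}

# Within each source, keep the conversations with the longest assistant responses.
selected = []
for source, quota in quotas.items():
    subset = ds.filter(lambda ex, s=source: ex["source"] == s)
    subset = subset.sort("response_length", reverse=True)
    selected.append(subset.select(range(min(quota, len(subset)))))

longest_1k = concatenate_datasets(selected)
```

Note that the independent rounding above may not sum to exactly 1,000 samples; the released dataset follows the exact allocation in `create_dataset.py`.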
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
OpenHermes-2.5-1k-longest is suitable for training chat models via supervised fine-tuning (SFT). To load the dataset run:
```python
from datasets import load_dataset
from transformers import AutoTokenizer
ds = load_dataset("HuggingFaceH4/OpenHermes-2.5-1k-longest")
# Load a tokenizer and apply chat template
tokenizer = AutoTokenizer.from_pretrained("teknium/OpenHermes-2.5-Mistral-7B")
example = ds["train_sft"][0]
formatted_example = tokenizer.apply_chat_template(example["messages"], tokenize=False)
print(formatted_example)
```
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Each example has the following structure:
```python
{
"source": "glaive-code-assist",
"category": None,
"messages": [
{
"content": "How can I copy code to StationA and StationB and set the hostnames for each station in the code?",
"role": "user",
},
{
"content": 'To copy code to StationA and StationB and set the hostnames, you can use the following code:\n\n```python\nfrom __future__ import with_statement, print_function\nfrom fabric.api import local, settings, abort, run, sudo, cd, hosts, env, execute\nfrom fabric.contrib.console import confirm\nfrom fabric.operations import put, get\nfrom fabric.contrib.project import rsync_project\n\nimport re\nimport subprocess as sp \nimport os.path \nfrom StringIO import StringIO\n\n\ncurrent_dir = os.path.dirname(os.path.realpath(__file__))\n\n\nVNC_LICENSE = [\n "xxxxx-xxxxx-xxxxx-xxxxx-xxxxx"\n]\n\n\n# TODO: Put a proper deployment mechanism here.\nenv.key_filename = \'/home/alcides/.ssh/zunzun_ec2_keypair_0.pem\'\n\nStationA_H = \'[email protected]\'\nStationB_H = \'[email protected]\'\nBeefy_H = \'[email protected]\'\n# TODO: Make the IP number below deployment-specific...\nBeefy_InternalIP = \'192.168.112.131\'\nStationA_InternalIP = \'192.168.112.129\'\nStationB_InternalIP = \'192.168.112.130\'\nHomeDir_Name = "ubuntu"\n\n\n@hosts(StationA_H)\ndef StationA():\n """\n Copies code to StationA \n """\n rsync_project(\n local_dir = "scripts/StationA",\n remote_dir = ("/home/{HomeDir_Name}/".format(HomeDir_Name=HomeDir_Name))\n )\n run("ln -sf /home/{HomeDir_Name}/StationA/onstartup.py /home/{HomeDir_Name}/onstartup.py".format(HomeDir_Name=HomeDir_Name))\n\n\n@hosts(StationB_H)\ndef setup_dns_masq():\n sudo("apt-get install -y dnsmasq")\n put(\n local_path = StringIO("addn-hosts=/home/{HomeDir_Name}/dnsmasq_more.conf\\n".format(HomeDir_Name=HomeDir_Name)),\n remote_path = "/etc/dnsmasq.conf",\n use_sudo=True)\n\n\n@hosts(StationB_H)\ndef StationB():\n """\n Copies both the chrome plugin and the DNSMasq watcher \n """\n rsync_project(\n local_dir = "scripts/StationB",\n remote_dir = ("/home/{HomeDir_Name}/".format(HomeDir_Name=HomeDir_Name))\n )\n rsync_project(\n local_dir = "scripts/StationA/chrome_captures_hars",\n remote_dir = (("/home/{HomeDir_Name}/StationB/".format(HomeDir_Name=HomeDir_Name)).format(HomeDir_Name=HomeDir_Name))\n )\n run("ln -sf /home/{HomeDir_Name}/StationB/onstartup.py /home/{HomeDir_Name}/onstartup.py".format(HomeDir_Name=HomeDir_Name))\n\n\n@hosts(StationB_H)\ndef install_updatednsmasq_service():\n with settings(warn_only=True):\n sudo("service updatednsmasq stop")\n put(\n local_path = "scripts/StationB/configure_dnsmasq.py",\n remote_path = "/home/{HomeDir_Name}/StationB/configure_dnsmasq.py".format(HomeDir_Name=HomeDir_Name) ,\n use_sudo = True\n )\n put(\n local_path = StringIO("""\ndescription "Update dnsmasq"\n\nstart on runlevel [2345]\nstop on runlevel [!2345]\n\numask 022\n\nconsole log\n\nenv PATH=/opt/openssl-1.0.2/bin/:/usr/bin:/usr/local/bin:/usr/sbin:/bin \nexport PATH\nenv LD_LIBRARY_PATH=/opt/openssl-1.0.2/lib\nexport LD_LIBRARY_PATH\nenv USER={HomeDir_Name}\nexport USER\n\nscript \n exec /usr/bin/python /home/{HomeDir_Name}/StationB/configure_dnsmasq.py\nend script\n\n""".format(HomeDir_Name=HomeDir_Name)),\n remote_path = "/etc/init/updatednsmasq.conf",\n use_sudo=True )\n sudo("service updatednsmasq start")\n\n\n@hosts(Beefy_H)\ndef Beefy():\n sudo("apt-get update")\n sudo("apt-get -y install libgmp-dev")\n\n\n@hosts(Beefy_H)\ndef BeefyRehMimic():\n with settings(warn_only=True):\n sudo("service mimic stop")\n put(\n local_path = "dist/build/reh-mimic/reh-mimic",\n remote_path = "/home/{HomeDir_Name}/reh-mimic".format(HomeDir_Name=HomeDir_Name)\n )\n run("chmod ugo+x /home/{HomeDir_Name}/reh-mimic".format(HomeDir_Name=HomeDir_Name))\n sudo("rm 
/home/{HomeDir_Name}/mimic -rf".format(HomeDir_Name=HomeDir_Name) )\n rsync_project(\n local_dir = "mimic",\n remote_dir = "/home/{HomeDir_Name}/".format(HomeDir_Name=HomeDir_Name),\n )\n put(\n local_path = "scripts/mimic.conf",\n remote_path = "/etc/init/mimic.conf",\n use_sudo = True\n )\n sudo("touch /root/.rnd")\n sudo("service mimic start")\n\n\n@hosts(Beefy_H, StationA_H, StationB_H )\ndef configure_logging():\n if env.host_string == StationA_H:\n put(\n local_path = StringIO("""$template Logentries,"199fb2e1-8227-4f73-9150-70a34a5d5e0c %HOSTNAME% %syslogtag%%msg%\\\\n"\n*.* @@api.logentries.com:10000;Logentries"""),\n remote_path = "/etc/rsyslog.d/70-logentries.conf",\n use_sudo = True )\n elif env.host_string == StationB_H:\n put(\n local_path = StringIO("""$template Logentries,"3d2fd756-407a-4764-b130-1dd6f22a1b62 %HOSTNAME% %syslogtag%%msg%\\\\n"\n*.* @@api.logentries.com:10000;Logentries"""),\n remote_path = "/etc/rsyslog.d/70-logentries.conf",\n use_sudo = True )\n else:\n put(\n local_path = StringIO("""$template Logentries,"7551d4e0-fa76-466f-8547-8c9a347a9363 %HOSTNAME% %syslogtag%%msg%\\\\n"\n*.* @@api.logentries.com:10000;Logentries"""),\n remote_path = "/etc/rsyslog.d/70-logentries.conf",\n use_sudo = True )\n \n sudo("service rsyslog restart")\n # Check logging works...\n sudo("logger -t test Hello there Logentries")\n\n\n@hosts (StationA_H, StationB_H)\ndef deploy_specific():\n if env.host_string == StationA_H:\n print("StationA deploy")\n StationA()\n elif env.host_string == StationB_H:\n print("StationB deploy")\n StationB()\n else: \n print("Beefy station deploy")\n Beefy()\n\n\n@hosts(StationA_H, StationB_H)\ndef apt_stations():\n sudo("apt-get update")\n sudo("apt-get install -y xutils xbase-clients xfonts-base xfonts-75dpi xfonts-100dpi")\n sudo("apt-get install -y python-pip")\n sudo("apt-get install -y xdotool")\n sudo("apt-get install -y xfwm4") \n\n\n@hosts(StationA_H, StationB_H)\ndef pythonlibs():\n sudo("pip install python-daemon>=2.0")\n sudo("pip install raven")\n\n\n@hosts(Beefy_H, StationA_H, StationB_H)\ndef ssl():\n """\n Copies Openssl and curl to the target hosts...\n """\n sudo("mkdir -p /opt/openssl-1.0.2/")\n sudo(("chown {HomeDir_Name} /opt/openssl-1.0.2/".format(HomeDir_Name=HomeDir_Name)))\n rsync_project(\n local_dir = "/opt/openssl-1.0.2",\n remote_dir = "/opt/",\n extra_opts="-avz"\n )\n put(\n local_path = "scripts/ca-certificates.crt",\n remote_path = "/etc/ssl/certs/ca-certificates.crt",\n use_sudo = True\n )\n\n\n@hosts(Beefy_H, StationA_H, StationB_H)\ndef ca():\n """\n Copies the ca certificate to the home...\n """\n put(\n local_path = "mimic-here/config/ca/cacert.pem",\n remote_path = ("/home/{HomeDir_Name}/cacert.pem".format(HomeDir_Name=HomeDir_Name)),\n use_sudo = True\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef install_vnc():\n """\n \n """\n # run("curl -L -o VNC.tar.gz https://www.realvnc.com/download/binary/1720/")\n # run("tar xvf VNC-5.2.3-Linux-x64-ANY.tar.gz")\n\n\n put(\n local_path = "scripts/VNC-5.2.3-Linux-x64-ANY.tar.gz",\n remote_path = ("/home/{HomeDir_Name}/VNC-5.2.3-Linux-x64-ANY.tar.gz".format(HomeDir_Name=HomeDir_Name)),\n use_sudo = True\n )\n run(("tar -xzf /home/{HomeDir_Name}/VNC-5.2.3-Linux-x64-ANY.tar.gz".format(HomeDir_Name=HomeDir_Name)))\n # Get a handier name.... 
\n run("rm -rf vnc")\n run(("mv /home/{HomeDir_Name}/VNC-5.2.3-Linux-x64 /home/{HomeDir_Name}/vnc".format(HomeDir_Name=HomeDir_Name)))\n sudo(("/home/{HomeDir_Name}/vnc/vnclicense -add {VncLicense}".format(\n HomeDir_Name= HomeDir_Name,\n VncLicense = VNC_LICENSE[0]\n )))\n # Will demand some for of interactive input...\n run(("mkdir -p /home/{HomeDir_Name}/.vnc/".format(HomeDir_Name=HomeDir_Name)))\n run(("mkdir -p /home/{HomeDir_Name}/.vnc/config.d/".format(HomeDir_Name=HomeDir_Name)))\n sudo(("/home/{HomeDir_Name}/vnc/vncpasswd /home/{HomeDir_Name}/.vnc/config.d/Xvnc".format(HomeDir_Name=HomeDir_Name)))\n vnc_fix_permissions()\n\n@hosts(StationA_H, StationB_H)\ndef vnc_fix_permissions():\n sudo(("chown {HomeDir_Name} /home/{HomeDir_Name}/.vnc -R").format(HomeDir_Name=HomeDir_Name))\n\n@hosts(StationA_H, StationB_H)\ndef install_vnc_xstartup():\n run(("mkdir -p /home/{HomeDir_Name}/.vnc/".format(HomeDir_Name=HomeDir_Name)))\n run(("mkdir -p /home/{HomeDir_Name}/.vnc/config.d/".format(HomeDir_Name=HomeDir_Name)))\n put(\n local_path = "scripts/vnc-xstartup",\n remote_path = ("/home/{HomeDir_Name}/.vnc/xstartup".format(HomeDir_Name=HomeDir_Name))\n )\n run("chmod ugo+x /home/{HomeDir_Name}/.vnc/xstartup".format(HomeDir_Name=HomeDir_Name))\n put(\n local_path = "scripts/xvncfontpath",\n remote_path = ("/home/{HomeDir_Name}/.vnc/config.d/xvncfontpath".format(HomeDir_Name=HomeDir_Name))\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef setup_google_chrome():\n put(\n local_path = "scripts/google-chrome-stable_current_amd64.deb",\n remote_path = ("/home/{HomeDir_Name}/google-chrome-stable_current_amd64.deb".format(HomeDir_Name=HomeDir_Name)),\n use_sudo = True\n )\n really_setup_google_chrome()\n\n\n@hosts(Beefy_H, StationA_H, StationB_H)\ndef ensure_local_hosts():\n # Get the contents of /etc/hosts\n local_file = StringIO()\n get(\n local_path = local_file,\n remote_path = "/etc/hosts",\n use_sudo = True \n )\n hosts_file = local_file.getvalue()\n snippet = """# DO NOT EDIT BELOW BY HAND\n{Beefy_InternalIP} instr.httpdos.com\n192.168.112.129 ip-192-168-112-129\n192.168.112.130 ip-192-168-112-130\n192.168.112.131 ip-192-168-112-131\n# END DO NOT EDIT BELOW""".format(\n StationA_InternalIP = StationA_InternalIP,\n Beefy_InternalIP = Beefy_InternalIP\n )\n mo = re.search(r"# DO NOT EDIT BELOW BY HAND\\n(.*?)\\n# END DO NOT EDIT BELOW", hosts_file, re.MULTILINE)\n if mo:\n part_before = hosts_file[:mo.start(0)]\n part_after = hosts_file[mo.end(0):]\n hosts_file = part_before + snippet + part_after\n else:\n hosts_file += "\\n" + snippet\n\n put(\n local_path = StringIO(hosts_file),\n remote_path = "/etc/hosts",\n use_sudo = True\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef really_setup_google_chrome():\n sudo("apt-get update")\n sudo(("apt-get -f install -y".format(HomeDir_Name=HomeDir_Name)))\n sudo("apt-get install -y --fix-missing xdg-utils")\n sudo(("dpkg -i --force-depends /home/{HomeDir_Name}/google-chrome-stable_current_amd64.deb".format(HomeDir_Name=HomeDir_Name)))\n sudo(("apt-get -f install -y".format(HomeDir_Name=HomeDir_Name)))\n\n\n@hosts(StationA_H, StationB_H)\ndef setup_vnc_service():\n put(\n local_path = "scripts/vncserv-{HomeDir_Name}.conf".format(HomeDir_Name=HomeDir_Name),\n remote_path = "/etc/init/vncserv.conf",\n use_sudo = True\n )\n put(\n local_path = "scripts/undaemon.py",\n remote_path = "/home/{HomeDir_Name}/undaemon.py".format(HomeDir_Name=HomeDir_Name)\n )\n run("chmod ugo+x /home/{HomeDir_Name}/undaemon.py".format(HomeDir_Name=HomeDir_Name))\n with 
settings(warn_only=True):\n sudo(\n "service vncserv start"\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef disable_lightdm():\n contents = StringIO("manual")\n put(\n local_path = contents, \n remote_path = "/etc/init/lightdm.override",\n use_sudo=True\n )\n\n\n@hosts(StationA_H, StationB_H)\ndef touch_xauthority():\n run("touch $HOME/.Xauthority")\n\n\n@hosts(StationA_H, StationB_H)\ndef deploy():\n execute(apt_stations)\n execute(setup_dns_masq)\n execute(setup_google_chrome)\n execute(deploy_specific)\n execute(touch_xauthority)\n execute(disable_lightdm)\n execute(StationA)\n execute(StationB)\n execute(Beefy)\n execute(ca)\n execute(ssl)\n execute(install_vnc)\n execute(install_vnc_xstartup)\n execute(ensure_local_hosts)\n execute(setup_vnc_service)\n execute(pythonlibs)\n execute(BeefyRehMimic)\n execute(install_updatednsmasq_service) \n```\n\nThis code uses Fabric to copy the code to StationA and StationB. The `StationA` function copies the code to StationA, while the `StationB` function copies the code to StationB. The `setup_dns_masq` function installs and sets up dnsmasq on StationB. The `deploy_specific` function deploys the code to the appropriate station based on the host string. The `apt_stations` function installs necessary packages on both StationA and StationB. The `pythonlibs` function installs required python libraries. The `ssl` function copies the OpenSSL and curl files to the target hosts. The `ca` function copies the ca certificate to the home directory. The `install_vnc` function installs VNC on both StationA and StationB. The `install_vnc_xstartup` function sets up the VNC xstartup file. The `setup_google_chrome` function installs Google Chrome on both StationA and StationB. The `ensure_local_hosts` function updates the /etc/hosts file with the appropriate IP addresses. The `really_setup_google_chrome` function completes the setup of Google Chrome. The `setup_vnc_service` function sets up the VNC service. The `disable_lightdm` function disables lightdm. The `touch_xauthority` function creates the .Xauthority file. The `deploy` function executes all the necessary steps to deploy the code to both StationA and StationB.',
"role": "assistant",
},
],
"average_response_length": 13327.0,
}
```
Here, `source` and `category` carry over metadata from the original OpenHermes-2.5 dataset, while `messages` holds the multi-turn conversation, which can be wrapped in a chat template like ChatML for training. The `average_response_length` field gives the average length, in characters, of the assistant's responses in the conversation.
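To make the last field concrete, it can be recomputed for any example directly from its `messages`. The helper below is a hypothetical illustration (not part of the dataset tooling), assuming characters are counted over the assistant turns only:

```python
from datasets import load_dataset

ds = load_dataset("HuggingFaceH4/OpenHermes-2.5-1k-longest")

def average_response_length(messages):
    # Mean character length of the assistant turns in one conversation
    # (the dataset ships this value precomputed).
    lengths = [len(m["content"]) for m in messages if m["role"] == "assistant"]
    return sum(lengths) / len(lengths)

example = ds["train_sft"][0]
print(average_response_length(example["messages"]), example["average_response_length"])
```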
## Dataset Creation
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
See the [`create_dataset.py`](https://huggingface.co/datasets/HuggingFaceH4/OpenHermes-2.5-1k-longest/blob/main/create_dataset.py) script for details on how the dataset was constructed.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
This dataset was derived from Teknium's high-quality [OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5) dataset, which mostly comprises GPT-4 instructions and demonstrations. | HuggingFaceH4/OpenHermes-2.5-1k-longest | [
"task_categories:text-generation",
"license:other",
"sft",
"synthetic",
"arxiv:2402.04833",
"region:us"
] | 2024-02-16T14:18:39+00:00 | {"license": "other", "task_categories": ["text-generation"], "pretty_name": "OpenHermes-2.5-1k-longest", "dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "average_response_length", "dtype": "float64"}], "splits": [{"name": "train_sft", "num_bytes": 6190190, "num_examples": 1000}, {"name": "test_sft", "num_bytes": 8730167, "num_examples": 1000}], "download_size": 5949801, "dataset_size": 14920357}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "test_sft", "path": "data/test_sft-*"}]}], "tags": ["sft", "synthetic"]} | 2024-02-16T16:32:00+00:00 |
15935b50afb80ba03c864805975708e8f0d64b77 | alirzb/SeizureClassifier_Wav2Vec_43243531_on_Bal_43827959 | [
"region:us"
] | 2024-02-16T14:19:36+00:00 | {"dataset_info": {"features": [{"name": "array", "sequence": "float64"}, {"name": "label_true", "dtype": "int64"}, {"name": "label_pred", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 164995272, "num_examples": 402}], "download_size": 89805200, "dataset_size": 164995272}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T14:19:52+00:00 |
|
a417d8f1c8c96e81b7f3dd6f403c3acca5b3671e | sajib09/badd_data_bd | [
"region:us"
] | 2024-02-16T14:20:22+00:00 | {} | 2024-02-16T14:20:22+00:00 |
|
8dfa09141b69db5e86914d00c76cff40c91663bb | OALL/Arabic_MMLU_1 | [
"region:us"
] | 2024-02-16T14:21:23+00:00 | {"dataset_info": [{"config_name": "abstract_algebra", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 27869, "num_examples": 100}, {"name": "dev", "num_bytes": 1174, "num_examples": 5}], "download_size": 20316, "dataset_size": 29043}, {"config_name": "anatomy", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 47319, "num_examples": 135}, {"name": "dev", "num_bytes": 1484, "num_examples": 5}], "download_size": 36141, "dataset_size": 48803}, {"config_name": "astronomy", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 67728, "num_examples": 152}, {"name": "dev", "num_bytes": 2916, "num_examples": 5}], "download_size": 50803, "dataset_size": 70644}, {"config_name": "business_ethics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 48814, "num_examples": 100}, {"name": "dev", "num_bytes": 3153, "num_examples": 5}], "download_size": 38162, "dataset_size": 51967}, {"config_name": "clinical_knowledge", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 95191, "num_examples": 265}, {"name": "dev", "num_bytes": 1675, "num_examples": 5}], "download_size": 64421, "dataset_size": 96866}, {"config_name": "college_biology", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 70399, "num_examples": 144}, {"name": "dev", "num_bytes": 2219, "num_examples": 5}], "download_size": 50813, "dataset_size": 72618}, {"config_name": "college_chemistry", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 34376, "num_examples": 100}, {"name": "dev", "num_bytes": 1938, "num_examples": 5}], "download_size": 32376, "dataset_size": 36314}, {"config_name": "college_computer_science", "features": [{"name": "question", "dtype": 
"string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 53379, "num_examples": 100}, {"name": "dev", "num_bytes": 3235, "num_examples": 5}], "download_size": 41637, "dataset_size": 56614}, {"config_name": "college_mathematics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 33548, "num_examples": 100}, {"name": "dev", "num_bytes": 1736, "num_examples": 5}], "download_size": 30243, "dataset_size": 35284}, {"config_name": "college_medicine", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 76876, "num_examples": 173}, {"name": "dev", "num_bytes": 2555, "num_examples": 5}], "download_size": 54812, "dataset_size": 79431}, {"config_name": "college_physics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 39473, "num_examples": 102}, {"name": "dev", "num_bytes": 1683, "num_examples": 5}], "download_size": 30732, "dataset_size": 41156}, {"config_name": "computer_security", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 37466, "num_examples": 100}, {"name": "dev", "num_bytes": 1665, "num_examples": 5}], "download_size": 31753, "dataset_size": 39131}, {"config_name": "conceptual_physics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 61245, "num_examples": 235}, {"name": "dev", "num_bytes": 1372, "num_examples": 5}], "download_size": 41491, "dataset_size": 62617}, {"config_name": "econometrics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 60901, "num_examples": 114}, {"name": "dev", "num_bytes": 2229, "num_examples": 5}], "download_size": 44978, "dataset_size": 63130}, {"config_name": "electrical_engineering", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": 
"D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 37987, "num_examples": 145}, {"name": "dev", "num_bytes": 1525, "num_examples": 5}], "download_size": 32424, "dataset_size": 39512}, {"config_name": "elementary_mathematics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 95373, "num_examples": 378}, {"name": "dev", "num_bytes": 1903, "num_examples": 5}], "download_size": 63782, "dataset_size": 97276}, {"config_name": "formal_logic", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 66408, "num_examples": 126}, {"name": "dev", "num_bytes": 2453, "num_examples": 5}], "download_size": 44236, "dataset_size": 68861}, {"config_name": "global_facts", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 28611, "num_examples": 100}, {"name": "dev", "num_bytes": 1657, "num_examples": 5}], "download_size": 27347, "dataset_size": 30268}, {"config_name": "high_school_biology", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 152224, "num_examples": 310}, {"name": "dev", "num_bytes": 2595, "num_examples": 5}], "download_size": 91899, "dataset_size": 154819}, {"config_name": "high_school_chemistry", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 79628, "num_examples": 203}, {"name": "dev", "num_bytes": 1900, "num_examples": 5}], "download_size": 52835, "dataset_size": 81528}, {"config_name": "high_school_computer_science", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 56439, "num_examples": 100}, {"name": "dev", "num_bytes": 4078, "num_examples": 5}], "download_size": 46992, "dataset_size": 60517}, {"config_name": "high_school_european_history", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", 
"dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 86218, "num_examples": 165}, {"name": "dev", "num_bytes": 2154, "num_examples": 5}], "download_size": 49553, "dataset_size": 88372}, {"config_name": "high_school_geography", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 62329, "num_examples": 198}, {"name": "dev", "num_bytes": 1929, "num_examples": 5}], "download_size": 45381, "dataset_size": 64258}, {"config_name": "high_school_government_and_politics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 95421, "num_examples": 193}, {"name": "dev", "num_bytes": 2454, "num_examples": 5}], "download_size": 63720, "dataset_size": 97875}, {"config_name": "high_school_macroeconomics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 173550, "num_examples": 390}, {"name": "dev", "num_bytes": 2181, "num_examples": 5}], "download_size": 92592, "dataset_size": 175731}, {"config_name": "high_school_mathematics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 69570, "num_examples": 270}, {"name": "dev", "num_bytes": 1817, "num_examples": 5}], "download_size": 47591, "dataset_size": 71387}, {"config_name": "high_school_microeconomics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 110429, "num_examples": 238}, {"name": "dev", "num_bytes": 1642, "num_examples": 5}], "download_size": 65501, "dataset_size": 112071}, {"config_name": "high_school_physics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 70933, "num_examples": 151}, {"name": "dev", "num_bytes": 1976, "num_examples": 5}], "download_size": 46812, "dataset_size": 72909}, {"config_name": "high_school_psychology", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 
219085, "num_examples": 545}, {"name": "dev", "num_bytes": 2251, "num_examples": 5}], "download_size": 124586, "dataset_size": 221336}, {"config_name": "high_school_statistics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 138484, "num_examples": 216}, {"name": "dev", "num_bytes": 2818, "num_examples": 5}], "download_size": 84140, "dataset_size": 141302}, {"config_name": "high_school_us_history", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 102628, "num_examples": 204}, {"name": "dev", "num_bytes": 1978, "num_examples": 5}], "download_size": 45926, "dataset_size": 104606}, {"config_name": "high_school_world_history", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 125016, "num_examples": 237}, {"name": "dev", "num_bytes": 1990, "num_examples": 5}], "download_size": 57934, "dataset_size": 127006}, {"config_name": "human_aging", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 69092, "num_examples": 223}, {"name": "dev", "num_bytes": 1505, "num_examples": 5}], "download_size": 49267, "dataset_size": 70597}, {"config_name": "human_sexuality", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 45634, "num_examples": 131}, {"name": "dev", "num_bytes": 1560, "num_examples": 5}], "download_size": 36633, "dataset_size": 47194}, {"config_name": "international_law", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 76896, "num_examples": 121}, {"name": "dev", "num_bytes": 3511, "num_examples": 5}], "download_size": 59264, "dataset_size": 80407}, {"config_name": "jurisprudence", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 46173, "num_examples": 108}, {"name": "dev", "num_bytes": 1675, "num_examples": 5}], "download_size": 39590, 
"dataset_size": 47848}, {"config_name": "logical_fallacies", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 67949, "num_examples": 163}, {"name": "dev", "num_bytes": 2070, "num_examples": 5}], "download_size": 46223, "dataset_size": 70019}, {"config_name": "machine_learning", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 47102, "num_examples": 112}, {"name": "dev", "num_bytes": 3348, "num_examples": 5}], "download_size": 40975, "dataset_size": 50450}, {"config_name": "management", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 28323, "num_examples": 103}, {"name": "dev", "num_bytes": 1212, "num_examples": 5}], "download_size": 26409, "dataset_size": 29535}, {"config_name": "marketing", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 90037, "num_examples": 234}, {"name": "dev", "num_bytes": 2437, "num_examples": 5}], "download_size": 59485, "dataset_size": 92474}, {"config_name": "medical_genetics", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 31333, "num_examples": 100}, {"name": "dev", "num_bytes": 1917, "num_examples": 5}], "download_size": 30818, "dataset_size": 33250}, {"config_name": "miscellaneous", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 209374, "num_examples": 783}, {"name": "dev", "num_bytes": 1079, "num_examples": 5}], "download_size": 128245, "dataset_size": 210453}, {"config_name": "moral_disputes", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 152598, "num_examples": 346}, {"name": "dev", "num_bytes": 2444, "num_examples": 5}], "download_size": 94470, "dataset_size": 155042}, {"config_name": "moral_scenarios", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": 
"string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 598093, "num_examples": 895}, {"name": "dev", "num_bytes": 3177, "num_examples": 5}], "download_size": 171872, "dataset_size": 601270}, {"config_name": "nutrition", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 140802, "num_examples": 306}, {"name": "dev", "num_bytes": 3167, "num_examples": 5}], "download_size": 88772, "dataset_size": 143969}, {"config_name": "philosophy", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 109824, "num_examples": 311}, {"name": "dev", "num_bytes": 1325, "num_examples": 5}], "download_size": 69547, "dataset_size": 111149}, {"config_name": "prehistory", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 129046, "num_examples": 324}, {"name": "dev", "num_bytes": 2730, "num_examples": 5}], "download_size": 84385, "dataset_size": 131776}, {"config_name": "professional_accounting", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 170502, "num_examples": 282}, {"name": "dev", "num_bytes": 2630, "num_examples": 5}], "download_size": 100246, "dataset_size": 173132}, {"config_name": "professional_law", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 1736111, "num_examples": 1534}, {"name": "dev", "num_bytes": 6811, "num_examples": 5}], "download_size": 841921, "dataset_size": 1742922}, {"config_name": "professional_medicine", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 161834, "num_examples": 272}, {"name": "dev", "num_bytes": 2879, "num_examples": 5}], "download_size": 80197, "dataset_size": 164713}, {"config_name": "professional_psychology", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": 
"string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 298038, "num_examples": 612}, {"name": "dev", "num_bytes": 3053, "num_examples": 5}], "download_size": 170209, "dataset_size": 301091}, {"config_name": "public_relations", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 39425, "num_examples": 110}, {"name": "dev", "num_bytes": 2141, "num_examples": 5}], "download_size": 34630, "dataset_size": 41566}, {"config_name": "security_studies", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 284253, "num_examples": 245}, {"name": "dev", "num_bytes": 7095, "num_examples": 5}], "download_size": 163448, "dataset_size": 291348}, {"config_name": "sociology", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 94143, "num_examples": 201}, {"name": "dev", "num_bytes": 2395, "num_examples": 5}], "download_size": 63829, "dataset_size": 96538}, {"config_name": "us_foreign_policy", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 42025, "num_examples": 100}, {"name": "dev", "num_bytes": 2450, "num_examples": 5}], "download_size": 35556, "dataset_size": 44475}, {"config_name": "virology", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 59891, "num_examples": 166}, {"name": "dev", "num_bytes": 1672, "num_examples": 5}], "download_size": 43367, "dataset_size": 61563}, {"config_name": "world_religions", "features": [{"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 35476, "num_examples": 171}, {"name": "dev", "num_bytes": 855, "num_examples": 5}], "download_size": 29676, "dataset_size": 36331}], "configs": [{"config_name": "abstract_algebra", "data_files": [{"split": "test", "path": "abstract_algebra/test-*"}, {"split": "dev", "path": "abstract_algebra/dev-*"}]}, {"config_name": "anatomy", "data_files": [{"split": "test", "path": "anatomy/test-*"}, {"split": "dev", "path": "anatomy/dev-*"}]}, {"config_name": "astronomy", "data_files": 
[{"split": "test", "path": "astronomy/test-*"}, {"split": "dev", "path": "astronomy/dev-*"}]}, {"config_name": "business_ethics", "data_files": [{"split": "test", "path": "business_ethics/test-*"}, {"split": "dev", "path": "business_ethics/dev-*"}]}, {"config_name": "clinical_knowledge", "data_files": [{"split": "test", "path": "clinical_knowledge/test-*"}, {"split": "dev", "path": "clinical_knowledge/dev-*"}]}, {"config_name": "college_biology", "data_files": [{"split": "test", "path": "college_biology/test-*"}, {"split": "dev", "path": "college_biology/dev-*"}]}, {"config_name": "college_chemistry", "data_files": [{"split": "test", "path": "college_chemistry/test-*"}, {"split": "dev", "path": "college_chemistry/dev-*"}]}, {"config_name": "college_computer_science", "data_files": [{"split": "test", "path": "college_computer_science/test-*"}, {"split": "dev", "path": "college_computer_science/dev-*"}]}, {"config_name": "college_mathematics", "data_files": [{"split": "test", "path": "college_mathematics/test-*"}, {"split": "dev", "path": "college_mathematics/dev-*"}]}, {"config_name": "college_medicine", "data_files": [{"split": "test", "path": "college_medicine/test-*"}, {"split": "dev", "path": "college_medicine/dev-*"}]}, {"config_name": "college_physics", "data_files": [{"split": "test", "path": "college_physics/test-*"}, {"split": "dev", "path": "college_physics/dev-*"}]}, {"config_name": "computer_security", "data_files": [{"split": "test", "path": "computer_security/test-*"}, {"split": "dev", "path": "computer_security/dev-*"}]}, {"config_name": "conceptual_physics", "data_files": [{"split": "test", "path": "conceptual_physics/test-*"}, {"split": "dev", "path": "conceptual_physics/dev-*"}]}, {"config_name": "econometrics", "data_files": [{"split": "test", "path": "econometrics/test-*"}, {"split": "dev", "path": "econometrics/dev-*"}]}, {"config_name": "electrical_engineering", "data_files": [{"split": "test", "path": "electrical_engineering/test-*"}, {"split": "dev", "path": "electrical_engineering/dev-*"}]}, {"config_name": "elementary_mathematics", "data_files": [{"split": "test", "path": "elementary_mathematics/test-*"}, {"split": "dev", "path": "elementary_mathematics/dev-*"}]}, {"config_name": "formal_logic", "data_files": [{"split": "test", "path": "formal_logic/test-*"}, {"split": "dev", "path": "formal_logic/dev-*"}]}, {"config_name": "global_facts", "data_files": [{"split": "test", "path": "global_facts/test-*"}, {"split": "dev", "path": "global_facts/dev-*"}]}, {"config_name": "high_school_biology", "data_files": [{"split": "test", "path": "high_school_biology/test-*"}, {"split": "dev", "path": "high_school_biology/dev-*"}]}, {"config_name": "high_school_chemistry", "data_files": [{"split": "test", "path": "high_school_chemistry/test-*"}, {"split": "dev", "path": "high_school_chemistry/dev-*"}]}, {"config_name": "high_school_computer_science", "data_files": [{"split": "test", "path": "high_school_computer_science/test-*"}, {"split": "dev", "path": "high_school_computer_science/dev-*"}]}, {"config_name": "high_school_european_history", "data_files": [{"split": "test", "path": "high_school_european_history/test-*"}, {"split": "dev", "path": "high_school_european_history/dev-*"}]}, {"config_name": "high_school_geography", "data_files": [{"split": "test", "path": "high_school_geography/test-*"}, {"split": "dev", "path": "high_school_geography/dev-*"}]}, {"config_name": "high_school_government_and_politics", "data_files": [{"split": "test", "path": 
"high_school_government_and_politics/test-*"}, {"split": "dev", "path": "high_school_government_and_politics/dev-*"}]}, {"config_name": "high_school_macroeconomics", "data_files": [{"split": "test", "path": "high_school_macroeconomics/test-*"}, {"split": "dev", "path": "high_school_macroeconomics/dev-*"}]}, {"config_name": "high_school_mathematics", "data_files": [{"split": "test", "path": "high_school_mathematics/test-*"}, {"split": "dev", "path": "high_school_mathematics/dev-*"}]}, {"config_name": "high_school_microeconomics", "data_files": [{"split": "test", "path": "high_school_microeconomics/test-*"}, {"split": "dev", "path": "high_school_microeconomics/dev-*"}]}, {"config_name": "high_school_physics", "data_files": [{"split": "test", "path": "high_school_physics/test-*"}, {"split": "dev", "path": "high_school_physics/dev-*"}]}, {"config_name": "high_school_psychology", "data_files": [{"split": "test", "path": "high_school_psychology/test-*"}, {"split": "dev", "path": "high_school_psychology/dev-*"}]}, {"config_name": "high_school_statistics", "data_files": [{"split": "test", "path": "high_school_statistics/test-*"}, {"split": "dev", "path": "high_school_statistics/dev-*"}]}, {"config_name": "high_school_us_history", "data_files": [{"split": "test", "path": "high_school_us_history/test-*"}, {"split": "dev", "path": "high_school_us_history/dev-*"}]}, {"config_name": "high_school_world_history", "data_files": [{"split": "test", "path": "high_school_world_history/test-*"}, {"split": "dev", "path": "high_school_world_history/dev-*"}]}, {"config_name": "human_aging", "data_files": [{"split": "test", "path": "human_aging/test-*"}, {"split": "dev", "path": "human_aging/dev-*"}]}, {"config_name": "human_sexuality", "data_files": [{"split": "test", "path": "human_sexuality/test-*"}, {"split": "dev", "path": "human_sexuality/dev-*"}]}, {"config_name": "international_law", "data_files": [{"split": "test", "path": "international_law/test-*"}, {"split": "dev", "path": "international_law/dev-*"}]}, {"config_name": "jurisprudence", "data_files": [{"split": "test", "path": "jurisprudence/test-*"}, {"split": "dev", "path": "jurisprudence/dev-*"}]}, {"config_name": "logical_fallacies", "data_files": [{"split": "test", "path": "logical_fallacies/test-*"}, {"split": "dev", "path": "logical_fallacies/dev-*"}]}, {"config_name": "machine_learning", "data_files": [{"split": "test", "path": "machine_learning/test-*"}, {"split": "dev", "path": "machine_learning/dev-*"}]}, {"config_name": "management", "data_files": [{"split": "test", "path": "management/test-*"}, {"split": "dev", "path": "management/dev-*"}]}, {"config_name": "marketing", "data_files": [{"split": "test", "path": "marketing/test-*"}, {"split": "dev", "path": "marketing/dev-*"}]}, {"config_name": "medical_genetics", "data_files": [{"split": "test", "path": "medical_genetics/test-*"}, {"split": "dev", "path": "medical_genetics/dev-*"}]}, {"config_name": "miscellaneous", "data_files": [{"split": "test", "path": "miscellaneous/test-*"}, {"split": "dev", "path": "miscellaneous/dev-*"}]}, {"config_name": "moral_disputes", "data_files": [{"split": "test", "path": "moral_disputes/test-*"}, {"split": "dev", "path": "moral_disputes/dev-*"}]}, {"config_name": "moral_scenarios", "data_files": [{"split": "test", "path": "moral_scenarios/test-*"}, {"split": "dev", "path": "moral_scenarios/dev-*"}]}, {"config_name": "nutrition", "data_files": [{"split": "test", "path": "nutrition/test-*"}, {"split": "dev", "path": "nutrition/dev-*"}]}, {"config_name": 
"philosophy", "data_files": [{"split": "test", "path": "philosophy/test-*"}, {"split": "dev", "path": "philosophy/dev-*"}]}, {"config_name": "prehistory", "data_files": [{"split": "test", "path": "prehistory/test-*"}, {"split": "dev", "path": "prehistory/dev-*"}]}, {"config_name": "professional_accounting", "data_files": [{"split": "test", "path": "professional_accounting/test-*"}, {"split": "dev", "path": "professional_accounting/dev-*"}]}, {"config_name": "professional_law", "data_files": [{"split": "test", "path": "professional_law/test-*"}, {"split": "dev", "path": "professional_law/dev-*"}]}, {"config_name": "professional_medicine", "data_files": [{"split": "test", "path": "professional_medicine/test-*"}, {"split": "dev", "path": "professional_medicine/dev-*"}]}, {"config_name": "professional_psychology", "data_files": [{"split": "test", "path": "professional_psychology/test-*"}, {"split": "dev", "path": "professional_psychology/dev-*"}]}, {"config_name": "public_relations", "data_files": [{"split": "test", "path": "public_relations/test-*"}, {"split": "dev", "path": "public_relations/dev-*"}]}, {"config_name": "security_studies", "data_files": [{"split": "test", "path": "security_studies/test-*"}, {"split": "dev", "path": "security_studies/dev-*"}]}, {"config_name": "sociology", "data_files": [{"split": "test", "path": "sociology/test-*"}, {"split": "dev", "path": "sociology/dev-*"}]}, {"config_name": "us_foreign_policy", "data_files": [{"split": "test", "path": "us_foreign_policy/test-*"}, {"split": "dev", "path": "us_foreign_policy/dev-*"}]}, {"config_name": "virology", "data_files": [{"split": "test", "path": "virology/test-*"}, {"split": "dev", "path": "virology/dev-*"}]}, {"config_name": "world_religions", "data_files": [{"split": "test", "path": "world_religions/test-*"}, {"split": "dev", "path": "world_religions/dev-*"}]}]} | 2024-02-16T14:27:31+00:00 |
|
d15b9e4b417296f3c29757bbd4592f9f7d966edf | maghwa/OpenHermes-2-AR-10K-22-640k-650k | [
"region:us"
] | 2024-02-16T14:29:23+00:00 | {"dataset_info": {"features": [{"name": "skip_prompt_formatting", "dtype": "null"}, {"name": "model_name", "dtype": "null"}, {"name": "model", "dtype": "null"}, {"name": "conversations", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "id", "dtype": "null"}, {"name": "avatarUrl", "dtype": "null"}, {"name": "idx", "dtype": "null"}, {"name": "language", "dtype": "null"}, {"name": "hash", "dtype": "null"}, {"name": "views", "dtype": "float64"}, {"name": "topic", "dtype": "null"}, {"name": "title", "dtype": "null"}, {"name": "category", "dtype": "null"}, {"name": "custom_instruction", "dtype": "null"}, {"name": "system_prompt", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 25425327, "num_examples": 10001}], "download_size": 11523686, "dataset_size": 25425327}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T14:29:26+00:00 |
|
43cfa11d9cec6707e807cb7c1718e07917558f18 | Recag/Rp_CommonC_722_1 | [
"region:us"
] | 2024-02-16T14:32:47+00:00 | {} | 2024-02-16T14:40:37+00:00 |
|
ca77014e708a93876b37356310455cf29584f537 | Recag/Rp_CommonC_722_2 | [
"region:us"
] | 2024-02-16T14:32:59+00:00 | {} | 2024-02-16T14:41:43+00:00 |
|
ce7f03031ce50ddbf9f747da42a59ef1f96d1dc2 | Recag/Rp_CommonC_722_3 | [
"region:us"
] | 2024-02-16T14:33:09+00:00 | {} | 2024-02-16T14:43:45+00:00 |
|
1749664c8cde379d732293a46292eff2fbc40d73 | Recag/Rp_CommonC_723_1 | [
"region:us"
] | 2024-02-16T14:33:22+00:00 | {} | 2024-02-16T14:47:06+00:00 |
|
1d2edfae5d65d6df935f0b627518cda25cb6218e | Recag/Rp_CommonC_723_2 | [
"region:us"
] | 2024-02-16T14:33:32+00:00 | {} | 2024-02-16T14:49:09+00:00 |
|
f3ec597ba0f591d49b72a6959badb8ded3e0bee5 | Recag/Rp_CommonC_723_3 | [
"region:us"
] | 2024-02-16T14:33:45+00:00 | {} | 2024-02-16T14:49:19+00:00 |
|
f462753c92e24140a196800fbd9e01c82d31b3b1 | Recag/Rp_CommonC_724_1 | [
"region:us"
] | 2024-02-16T14:33:56+00:00 | {} | 2024-02-16T15:10:46+00:00 |
|
aeeb6d9896a4c07eae6c2c336c87fd2016a2a5ca | Recag/Rp_CommonC_724_2 | [
"region:us"
] | 2024-02-16T14:34:07+00:00 | {} | 2024-02-16T15:13:57+00:00 |
|
6460bb92367d557b998c8fa3354adc661b2596b7 | Recag/Rp_CommonC_724_3 | [
"region:us"
] | 2024-02-16T14:34:18+00:00 | {} | 2024-02-16T15:12:43+00:00 |
|
32c2eb3da242e9dec0d60312467b091aee2e42c0 | Recag/Rp_CommonC_725_1 | [
"region:us"
] | 2024-02-16T14:34:30+00:00 | {} | 2024-02-16T15:13:58+00:00 |
|
afc81a824a953ab4b5124c577dc2d23c8c4e8322 | Recag/Rp_CommonC_725_2 | [
"region:us"
] | 2024-02-16T14:34:40+00:00 | {} | 2024-02-16T15:20:05+00:00 |
|
286dbbda26a21db51793a648f877b3f76841257e | Recag/Rp_CommonC_725_3 | [
"region:us"
] | 2024-02-16T14:34:51+00:00 | {} | 2024-02-16T15:20:14+00:00 |
|
6e08c7f1acf0da89ee26e5c380117532778319a1 | benayas/banking_augmented_5pct_v2 | [
"region:us"
] | 2024-02-16T14:40:05+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 982661, "num_examples": 10003}], "download_size": 381906, "dataset_size": 982661}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T14:40:09+00:00 |
|
efab93a6b679edb018757c95a9a5d631af86680d | khalilmas9/Fashion_brands | [
"region:us"
] | 2024-02-16T14:41:30+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1578860.0, "num_examples": 10}], "download_size": 1563176, "dataset_size": 1578860.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T15:45:22+00:00 |
|
8e0b2e26c91e1d34b772313eab9840eae9442a9e | ElderlyDed/CheckPhoto5k | [
"region:us"
] | 2024-02-16T14:41:51+00:00 | {} | 2024-02-16T14:42:00+00:00 |
|
dc527cd4eabf6c42bd5fc03680d4bcb7ed1291f6 | filipetrinidad/teste_open | [
"region:us"
] | 2024-02-16T14:43:36+00:00 | {} | 2024-02-16T14:43:51+00:00 |
|
7f51952bd89a96dbc983c7dfd5544eb6e9b8458b | lingtrain/udmurt-russian | [
"region:us"
] | 2024-02-16T14:43:39+00:00 | {"dataset_info": {"features": [{"name": "ru", "dtype": "string"}, {"name": "udm", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 14163182, "num_examples": 36150}], "download_size": 6509428, "dataset_size": 14163182}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T14:43:41+00:00 |
|
5110bd9bd15658f5ddf52bf4db9a418ed5a50ddf | filipetrinidad/image_i | [
"region:us"
] | 2024-02-16T14:47:45+00:00 | {} | 2024-02-16T14:48:03+00:00 |
|
dc9f8d00ec3fffaa8af578225e2ffdca23d37a2a |
# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian2.0](https://huggingface.co/RaduGabriel/SirUkrainian2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Pull the per-sample details for one task (here: 5-shot Winogrande).
# The "train" split always points to the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0",
	"harness_winogrande_5",
	split="train")
```
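The same call can target the aggregated "results" configuration mentioned above. A minimal sketch, assuming that configuration keeps the naming used in this card (the exact feature layout can differ between runs, so inspect it before relying on a specific schema):

```python
from datasets import load_dataset

# Load the aggregated metrics for this model; the config name "results" is taken
# from the card text above and is an assumption about this particular repo.
results = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0",
	"results",
	split="train")

# Check the available columns and the first row before indexing into them.
print(results.column_names)
print(results[0])
```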
## Latest results
These are the [latest results from run 2024-02-16T14:47:08.297350](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0/blob/main/results_2024-02-16T14-47-08.297350.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6122899617068881,
"acc_stderr": 0.03314256377542638,
"acc_norm": 0.6163125160011517,
"acc_norm_stderr": 0.033822587397925895,
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104,
"mc2": 0.6423733209082649,
"mc2_stderr": 0.01507454376325255
},
"harness|arc:challenge|25": {
"acc": 0.5989761092150171,
"acc_stderr": 0.014322255790719865,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.650866361282613,
"acc_stderr": 0.004757220449283699,
"acc_norm": 0.832603067118104,
"acc_norm_stderr": 0.0037256689970413094
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102289,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102289
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.02486499515976775,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.02486499515976775
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693257,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823297,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4782122905027933,
"acc_stderr": 0.016706617522176132,
"acc_norm": 0.4782122905027933,
"acc_norm_stderr": 0.016706617522176132
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948855,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948855
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001855,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001855
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.01268201633564667,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.01268201633564667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.01969145905235403,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.01969145905235403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104,
"mc2": 0.6423733209082649,
"mc2_stderr": 0.01507454376325255
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.41015921152388174,
"acc_stderr": 0.013548335117860338
}
}
```
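To work with these numbers outside the `datasets` API, the raw JSON file linked above can be downloaded directly. A minimal sketch, assuming the results file sits at the repository root as in the link, and that the per-task metrics may be nested under a top-level "results" key (both are assumptions about this repo's layout):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file from the dataset repository
# (filename taken from the link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0",
    filename="results_2024-02-16T14-47-08.297350.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# The card shows the metrics dict directly; some runs nest it under a "results" key.
metrics = raw.get("results", raw)
print(metrics["all"]["acc"])       # average accuracy across tasks
print(metrics["harness|gsm8k|5"])  # a per-task entry, e.g. GSM8K 5-shot
```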
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0 | [
"region:us"
] | 2024-02-16T14:49:26+00:00 | {"pretty_name": "Evaluation run of RaduGabriel/SirUkrainian2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian2.0](https://huggingface.co/RaduGabriel/SirUkrainian2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T14:47:08.297350](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0/blob/main/results_2024-02-16T14-47-08.297350.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6122899617068881,\n \"acc_stderr\": 0.03314256377542638,\n \"acc_norm\": 0.6163125160011517,\n \"acc_norm_stderr\": 0.033822587397925895,\n \"mc1\": 0.4749082007343941,\n \"mc1_stderr\": 0.017481446804104,\n \"mc2\": 0.6423733209082649,\n \"mc2_stderr\": 0.01507454376325255\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5989761092150171,\n \"acc_stderr\": 0.014322255790719865,\n \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.650866361282613,\n \"acc_stderr\": 0.004757220449283699,\n \"acc_norm\": 0.832603067118104,\n \"acc_norm_stderr\": 0.0037256689970413094\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.02556060472102289,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.02556060472102289\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.02486499515976775,\n \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693257,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693257\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n \"acc_stderr\": 0.014743125394823297,\n 
\"acc_norm\": 0.7828863346104725,\n \"acc_norm_stderr\": 0.014743125394823297\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4782122905027933,\n \"acc_stderr\": 0.016706617522176132,\n \"acc_norm\": 0.4782122905027933,\n \"acc_norm_stderr\": 0.016706617522176132\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.02685882587948855,\n \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.02685882587948855\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001855,\n \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001855\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n \"acc_stderr\": 0.01268201633564667,\n \"acc_norm\": 0.44132985658409385,\n \"acc_norm_stderr\": 0.01268201633564667\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235403,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235403\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4749082007343941,\n \"mc1_stderr\": 0.017481446804104,\n \"mc2\": 0.6423733209082649,\n \"mc2_stderr\": 0.01507454376325255\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41015921152388174,\n \"acc_stderr\": 0.013548335117860338\n }\n}\n```", "repo_url": 
"https://huggingface.co/RaduGabriel/SirUkrainian2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["**/details_harness|winogrande|5_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T14-47-08.297350.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T14_47_08.297350", "path": ["results_2024-02-16T14-47-08.297350.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T14-47-08.297350.parquet"]}]}]} | 2024-02-16T14:49:48+00:00 |
ea23d11aa86606eadae7f8b72fa8eae3863e2545 |
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-dolphin-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-dolphin-sft](https://huggingface.co/CorticalStack/mistral-7b-dolphin-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft",
"harness_winogrande_5",
split="train")
```
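The aggregated metrics can be loaded the same way by pointing at the "results" configuration mentioned above. This is a minimal sketch assuming the "latest" split alias is available for that configuration (as it is for the harness configurations); the exact column layout of the results rows is not documented here, so the final print is only illustrative.

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split alias
# points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft",
    "results",
    split="latest",
)

# One row per run; inspect the stored aggregate metrics.
print(results[0])
```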
## Latest results
These are the [latest results from run 2024-02-16T14:55:12.739347](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft/blob/main/results_2024-02-16T14-55-12.739347.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6226391506011404,
"acc_stderr": 0.032752871244970075,
"acc_norm": 0.6284357429187831,
"acc_norm_stderr": 0.033420600664784014,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.4891471279958395,
"mc2_stderr": 0.014787543186222349
},
"harness|arc:challenge|25": {
"acc": 0.5486348122866894,
"acc_stderr": 0.014542104569955269,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650647
},
"harness|hellaswag|10": {
"acc": 0.6243776140211114,
"acc_stderr": 0.0048329345291207955,
"acc_norm": 0.8301135232025493,
"acc_norm_stderr": 0.0037476555337545158
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.01690927688493609,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.01690927688493609
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594202,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594202
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3139664804469274,
"acc_stderr": 0.01552192393352364,
"acc_norm": 0.3139664804469274,
"acc_norm_stderr": 0.01552192393352364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935722,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935722
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854128,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.4891471279958395,
"mc2_stderr": 0.014787543186222349
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.35784685367702807,
"acc_stderr": 0.013204142536119939
}
}
```
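If only this snapshot is needed rather than the per-sample details, the results JSON linked above can also be fetched directly from the dataset repository. The sketch below uses `huggingface_hub.hf_hub_download` with the filename taken from the link in this section; depending on the file layout, the aggregated metrics may sit at the top level or under a "results" key, so the access is hedged accordingly.

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file for the latest run of this model.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft",
    filename="results_2024-02-16T14-55-12.739347.json",
    repo_type="dataset",
)

with open(path) as f:
    payload = json.load(f)

# The aggregated metrics may be nested under a "results" key.
metrics = payload.get("results", payload)
print(metrics["all"])
```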
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft | [
"region:us"
] | 2024-02-16T14:57:34+00:00 | {"pretty_name": "Evaluation run of CorticalStack/mistral-7b-dolphin-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-dolphin-sft](https://huggingface.co/CorticalStack/mistral-7b-dolphin-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T14:55:12.739347](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft/blob/main/results_2024-02-16T14-55-12.739347.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6226391506011404,\n \"acc_stderr\": 0.032752871244970075,\n \"acc_norm\": 0.6284357429187831,\n \"acc_norm_stderr\": 0.033420600664784014,\n \"mc1\": 0.32802937576499386,\n \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.4891471279958395,\n \"mc2_stderr\": 0.014787543186222349\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5486348122866894,\n \"acc_stderr\": 0.014542104569955269,\n \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650647\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6243776140211114,\n \"acc_stderr\": 0.0048329345291207955,\n \"acc_norm\": 0.8301135232025493,\n \"acc_norm_stderr\": 0.0037476555337545158\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.01690927688493609,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.01690927688493609\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7994891443167306,\n \"acc_stderr\": 0.014317653708594202,\n \"acc_norm\": 0.7994891443167306,\n \"acc_norm_stderr\": 0.014317653708594202\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3139664804469274,\n \"acc_stderr\": 0.01552192393352364,\n \"acc_norm\": 0.3139664804469274,\n \"acc_norm_stderr\": 0.01552192393352364\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n \"acc_stderr\": 0.012669813464935722,\n \"acc_norm\": 0.43741851368970014,\n \"acc_norm_stderr\": 0.012669813464935722\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854128,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854128\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.4891471279958395,\n \"mc2_stderr\": 0.014787543186222349\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35784685367702807,\n \"acc_stderr\": 0.013204142536119939\n 
}\n}\n```", "repo_url": "https://huggingface.co/CorticalStack/mistral-7b-dolphin-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T14_55_12.739347", "path": ["**/details_harness|winogrande|5_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T14-55-12.739347.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T14_55_12.739347", "path": ["results_2024-02-16T14-55-12.739347.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T14-55-12.739347.parquet"]}]}]} | 2024-02-16T14:57:57+00:00 |
bc749e978cd90caa184230f8c9dc7e988ecfc1eb | sawradip/phone-asr-data | [
"region:us"
] | 2024-02-16T15:02:33+00:00 | {"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "filename", "dtype": "string"}, {"name": "transcription", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 212929368.47523925, "num_examples": 6128}, {"name": "test", "num_bytes": 24229582.980760757, "num_examples": 681}], "download_size": 244822791, "dataset_size": 237158951.456}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-16T15:02:55+00:00 |
|
529c20a2a18eff62e4a14e0a035b962e493901e2 | IvanSSY615/HYchin | [
"region:us"
] | 2024-02-16T15:03:36+00:00 | {} | 2024-02-16T15:03:47+00:00 |
|
0ae5ab553135a00cf69dab2cdbf09d75ec4c119d | OALL/ACVA | [
"region:us"
] | 2024-02-16T15:06:16+00:00 | {"dataset_info": [{"config_name": "Algeria", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 21190, "num_examples": 195}, {"name": "validation", "num_bytes": 467, "num_examples": 5}], "download_size": 7918, "dataset_size": 21657}, {"config_name": "Ancient_Egypt", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 44667, "num_examples": 315}, {"name": "validation", "num_bytes": 712, "num_examples": 5}], "download_size": 19408, "dataset_size": 45379}, {"config_name": "Arab_Empire", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 41076, "num_examples": 265}, {"name": "validation", "num_bytes": 785, "num_examples": 5}], "download_size": 11490, "dataset_size": 41861}, {"config_name": "Arabic_Architecture", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 29911, "num_examples": 195}, {"name": "validation", "num_bytes": 736, "num_examples": 5}], "download_size": 13197, "dataset_size": 30647}, {"config_name": "Arabic_Art", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 24272, "num_examples": 195}, {"name": "validation", "num_bytes": 672, "num_examples": 5}], "download_size": 10495, "dataset_size": 24944}, {"config_name": "Arabic_Astronomy", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 24206, "num_examples": 195}, {"name": "validation", "num_bytes": 592, "num_examples": 5}], "download_size": 6978, "dataset_size": 24798}, {"config_name": "Arabic_Calligraphy", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 35097, "num_examples": 255}, {"name": "validation", "num_bytes": 701, "num_examples": 5}], "download_size": 12047, "dataset_size": 35798}, {"config_name": "Arabic_Ceremony", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 25266, "num_examples": 185}, {"name": "validation", "num_bytes": 708, "num_examples": 5}], "download_size": 12315, "dataset_size": 25974}, {"config_name": "Arabic_Clothing", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 24245, "num_examples": 195}, {"name": "validation", "num_bytes": 622, "num_examples": 5}], "download_size": 8063, "dataset_size": 24867}, {"config_name": "Arabic_Culture", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 27444, "num_examples": 195}, {"name": "validation", "num_bytes": 680, "num_examples": 5}], "download_size": 10513, "dataset_size": 28124}, {"config_name": "Arabic_Food", "features": [{"name": "id", "dtype": "string"}, 
{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 21484, "num_examples": 195}, {"name": "validation", "num_bytes": 507, "num_examples": 5}], "download_size": 7783, "dataset_size": 21991}, {"config_name": "Arabic_Funeral", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 13955, "num_examples": 95}, {"name": "validation", "num_bytes": 745, "num_examples": 5}], "download_size": 7576, "dataset_size": 14700}, {"config_name": "Arabic_Geography", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 16450, "num_examples": 145}, {"name": "validation", "num_bytes": 484, "num_examples": 5}], "download_size": 8762, "dataset_size": 16934}, {"config_name": "Arabic_History", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 23059, "num_examples": 195}, {"name": "validation", "num_bytes": 571, "num_examples": 5}], "download_size": 10052, "dataset_size": 23630}, {"config_name": "Arabic_Language_Origin", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 13050, "num_examples": 95}, {"name": "validation", "num_bytes": 681, "num_examples": 5}], "download_size": 8191, "dataset_size": 13731}, {"config_name": "Arabic_Literature", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 20710, "num_examples": 145}, {"name": "validation", "num_bytes": 633, "num_examples": 5}], "download_size": 8642, "dataset_size": 21343}, {"config_name": "Arabic_Math", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 22913, "num_examples": 195}, {"name": "validation", "num_bytes": 534, "num_examples": 5}], "download_size": 7671, "dataset_size": 23447}, {"config_name": "Arabic_Medicine", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 19490, "num_examples": 145}, {"name": "validation", "num_bytes": 681, "num_examples": 5}], "download_size": 10822, "dataset_size": 20171}, {"config_name": "Arabic_Music", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 17455, "num_examples": 139}, {"name": "validation", "num_bytes": 581, "num_examples": 5}], "download_size": 9859, "dataset_size": 18036}, {"config_name": "Arabic_Ornament", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 22077, "num_examples": 195}, {"name": "validation", "num_bytes": 650, "num_examples": 5}], "download_size": 10006, "dataset_size": 22727}, {"config_name": "Arabic_Philosophy", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", 
"num_bytes": 19210, "num_examples": 145}, {"name": "validation", "num_bytes": 644, "num_examples": 5}], "download_size": 7043, "dataset_size": 19854}, {"config_name": "Arabic_Physics_and_Chemistry", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 26423, "num_examples": 195}, {"name": "validation", "num_bytes": 650, "num_examples": 5}], "download_size": 8476, "dataset_size": 27073}, {"config_name": "Arabic_Wedding", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 30413, "num_examples": 195}, {"name": "validation", "num_bytes": 776, "num_examples": 5}], "download_size": 10818, "dataset_size": 31189}, {"config_name": "Bahrain", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 5315, "num_examples": 45}, {"name": "validation", "num_bytes": 634, "num_examples": 5}], "download_size": 7167, "dataset_size": 5949}, {"config_name": "Comoros", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 5337, "num_examples": 45}, {"name": "validation", "num_bytes": 601, "num_examples": 5}], "download_size": 6624, "dataset_size": 5938}, {"config_name": "Egypt_modern", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 10346, "num_examples": 95}, {"name": "validation", "num_bytes": 620, "num_examples": 5}], "download_size": 8766, "dataset_size": 10966}, {"config_name": "InfluenceFromAncientEgypt", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 32183, "num_examples": 195}, {"name": "validation", "num_bytes": 823, "num_examples": 5}], "download_size": 11846, "dataset_size": 33006}, {"config_name": "InfluenceFromByzantium", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 22701, "num_examples": 145}, {"name": "validation", "num_bytes": 797, "num_examples": 5}], "download_size": 9634, "dataset_size": 23498}, {"config_name": "InfluenceFromChina", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 24557, "num_examples": 195}, {"name": "validation", "num_bytes": 621, "num_examples": 5}], "download_size": 9229, "dataset_size": 25178}, {"config_name": "InfluenceFromGreece", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 29809, "num_examples": 195}, {"name": "validation", "num_bytes": 699, "num_examples": 5}], "download_size": 9532, "dataset_size": 30508}, {"config_name": "InfluenceFromIslam", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 21113, "num_examples": 145}, {"name": "validation", "num_bytes": 749, "num_examples": 5}], 
"download_size": 12010, "dataset_size": 21862}, {"config_name": "InfluenceFromPersia", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 24628, "num_examples": 175}, {"name": "validation", "num_bytes": 666, "num_examples": 5}], "download_size": 11766, "dataset_size": 25294}, {"config_name": "InfluenceFromRome", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 25024, "num_examples": 195}, {"name": "validation", "num_bytes": 684, "num_examples": 5}], "download_size": 10184, "dataset_size": 25708}, {"config_name": "Iraq", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 8576, "num_examples": 85}, {"name": "validation", "num_bytes": 487, "num_examples": 5}], "download_size": 7397, "dataset_size": 9063}, {"config_name": "Islam_Education", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 21483, "num_examples": 195}, {"name": "validation", "num_bytes": 542, "num_examples": 5}], "download_size": 9604, "dataset_size": 22025}, {"config_name": "Islam_branches_and_schools", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 26534, "num_examples": 175}, {"name": "validation", "num_bytes": 781, "num_examples": 5}], "download_size": 7979, "dataset_size": 27315}, {"config_name": "Islamic_law_system", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 27431, "num_examples": 195}, {"name": "validation", "num_bytes": 709, "num_examples": 5}], "download_size": 11025, "dataset_size": 28140}, {"config_name": "Jordan", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3982, "num_examples": 45}, {"name": "validation", "num_bytes": 457, "num_examples": 5}], "download_size": 5934, "dataset_size": 4439}, {"config_name": "Kuwait", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4769, "num_examples": 45}, {"name": "validation", "num_bytes": 503, "num_examples": 5}], "download_size": 6114, "dataset_size": 5272}, {"config_name": "Lebanon", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4763, "num_examples": 45}, {"name": "validation", "num_bytes": 473, "num_examples": 5}], "download_size": 6483, "dataset_size": 5236}, {"config_name": "Libya", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4476, "num_examples": 45}, {"name": "validation", "num_bytes": 514, "num_examples": 5}], "download_size": 6288, "dataset_size": 4990}, {"config_name": "Mauritania", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": 
"string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 5291, "num_examples": 45}, {"name": "validation", "num_bytes": 617, "num_examples": 5}], "download_size": 6839, "dataset_size": 5908}, {"config_name": "Mesopotamia_civilization", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 23934, "num_examples": 155}, {"name": "validation", "num_bytes": 776, "num_examples": 5}], "download_size": 11533, "dataset_size": 24710}, {"config_name": "Morocco", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4875, "num_examples": 45}, {"name": "validation", "num_bytes": 561, "num_examples": 5}], "download_size": 6800, "dataset_size": 5436}, {"config_name": "Oman", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 5127, "num_examples": 45}, {"name": "validation", "num_bytes": 518, "num_examples": 5}], "download_size": 6782, "dataset_size": 5645}, {"config_name": "Palestine", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 9261, "num_examples": 85}, {"name": "validation", "num_bytes": 543, "num_examples": 5}], "download_size": 8626, "dataset_size": 9804}, {"config_name": "Qatar", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4967, "num_examples": 45}, {"name": "validation", "num_bytes": 601, "num_examples": 5}], "download_size": 7076, "dataset_size": 5568}, {"config_name": "Saudi_Arabia", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 26540, "num_examples": 195}, {"name": "validation", "num_bytes": 740, "num_examples": 5}], "download_size": 8313, "dataset_size": 27280}, {"config_name": "Somalia", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 5242, "num_examples": 45}, {"name": "validation", "num_bytes": 562, "num_examples": 5}], "download_size": 6546, "dataset_size": 5804}, {"config_name": "Sudan", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4880, "num_examples": 45}, {"name": "validation", "num_bytes": 520, "num_examples": 5}], "download_size": 6295, "dataset_size": 5400}, {"config_name": "Syria", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4645, "num_examples": 45}, {"name": "validation", "num_bytes": 584, "num_examples": 5}], "download_size": 6458, "dataset_size": 5229}, {"config_name": "Tunisia", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4775, "num_examples": 45}, {"name": "validation", "num_bytes": 545, "num_examples": 5}], "download_size": 5910, 
"dataset_size": 5320}, {"config_name": "United_Arab_Emirates", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 12793, "num_examples": 85}, {"name": "validation", "num_bytes": 777, "num_examples": 5}], "download_size": 8828, "dataset_size": 13570}, {"config_name": "Yemen", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 937, "num_examples": 10}, {"name": "validation", "num_bytes": 484, "num_examples": 5}], "download_size": 5016, "dataset_size": 1421}, {"config_name": "communication", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 55967, "num_examples": 364}, {"name": "validation", "num_bytes": 833, "num_examples": 5}], "download_size": 14248, "dataset_size": 56800}, {"config_name": "computer_and_phone", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 53693, "num_examples": 295}, {"name": "validation", "num_bytes": 912, "num_examples": 5}], "download_size": 16657, "dataset_size": 54605}, {"config_name": "daily_life", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 53067, "num_examples": 337}, {"name": "validation", "num_bytes": 806, "num_examples": 5}], "download_size": 13780, "dataset_size": 53873}, {"config_name": "entertainment", "features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 41054, "num_examples": 295}, {"name": "validation", "num_bytes": 729, "num_examples": 5}], "download_size": 12285, "dataset_size": 41783}], "configs": [{"config_name": "Algeria", "data_files": [{"split": "test", "path": "Algeria/test-*"}, {"split": "validation", "path": "Algeria/validation-*"}]}, {"config_name": "Ancient_Egypt", "data_files": [{"split": "test", "path": "Ancient_Egypt/test-*"}, {"split": "validation", "path": "Ancient_Egypt/validation-*"}]}, {"config_name": "Arab_Empire", "data_files": [{"split": "test", "path": "Arab_Empire/test-*"}, {"split": "validation", "path": "Arab_Empire/validation-*"}]}, {"config_name": "Arabic_Architecture", "data_files": [{"split": "test", "path": "Arabic_Architecture/test-*"}, {"split": "validation", "path": "Arabic_Architecture/validation-*"}]}, {"config_name": "Arabic_Art", "data_files": [{"split": "test", "path": "Arabic_Art/test-*"}, {"split": "validation", "path": "Arabic_Art/validation-*"}]}, {"config_name": "Arabic_Astronomy", "data_files": [{"split": "test", "path": "Arabic_Astronomy/test-*"}, {"split": "validation", "path": "Arabic_Astronomy/validation-*"}]}, {"config_name": "Arabic_Calligraphy", "data_files": [{"split": "test", "path": "Arabic_Calligraphy/test-*"}, {"split": "validation", "path": "Arabic_Calligraphy/validation-*"}]}, {"config_name": "Arabic_Ceremony", "data_files": [{"split": "test", "path": "Arabic_Ceremony/test-*"}, {"split": "validation", "path": "Arabic_Ceremony/validation-*"}]}, {"config_name": "Arabic_Clothing", "data_files": [{"split": "test", "path": "Arabic_Clothing/test-*"}, {"split": "validation", "path": 
"Arabic_Clothing/validation-*"}]}, {"config_name": "Arabic_Culture", "data_files": [{"split": "test", "path": "Arabic_Culture/test-*"}, {"split": "validation", "path": "Arabic_Culture/validation-*"}]}, {"config_name": "Arabic_Food", "data_files": [{"split": "test", "path": "Arabic_Food/test-*"}, {"split": "validation", "path": "Arabic_Food/validation-*"}]}, {"config_name": "Arabic_Funeral", "data_files": [{"split": "test", "path": "Arabic_Funeral/test-*"}, {"split": "validation", "path": "Arabic_Funeral/validation-*"}]}, {"config_name": "Arabic_Geography", "data_files": [{"split": "test", "path": "Arabic_Geography/test-*"}, {"split": "validation", "path": "Arabic_Geography/validation-*"}]}, {"config_name": "Arabic_History", "data_files": [{"split": "test", "path": "Arabic_History/test-*"}, {"split": "validation", "path": "Arabic_History/validation-*"}]}, {"config_name": "Arabic_Language_Origin", "data_files": [{"split": "test", "path": "Arabic_Language_Origin/test-*"}, {"split": "validation", "path": "Arabic_Language_Origin/validation-*"}]}, {"config_name": "Arabic_Literature", "data_files": [{"split": "test", "path": "Arabic_Literature/test-*"}, {"split": "validation", "path": "Arabic_Literature/validation-*"}]}, {"config_name": "Arabic_Math", "data_files": [{"split": "test", "path": "Arabic_Math/test-*"}, {"split": "validation", "path": "Arabic_Math/validation-*"}]}, {"config_name": "Arabic_Medicine", "data_files": [{"split": "test", "path": "Arabic_Medicine/test-*"}, {"split": "validation", "path": "Arabic_Medicine/validation-*"}]}, {"config_name": "Arabic_Music", "data_files": [{"split": "test", "path": "Arabic_Music/test-*"}, {"split": "validation", "path": "Arabic_Music/validation-*"}]}, {"config_name": "Arabic_Ornament", "data_files": [{"split": "test", "path": "Arabic_Ornament/test-*"}, {"split": "validation", "path": "Arabic_Ornament/validation-*"}]}, {"config_name": "Arabic_Philosophy", "data_files": [{"split": "test", "path": "Arabic_Philosophy/test-*"}, {"split": "validation", "path": "Arabic_Philosophy/validation-*"}]}, {"config_name": "Arabic_Physics_and_Chemistry", "data_files": [{"split": "test", "path": "Arabic_Physics_and_Chemistry/test-*"}, {"split": "validation", "path": "Arabic_Physics_and_Chemistry/validation-*"}]}, {"config_name": "Arabic_Wedding", "data_files": [{"split": "test", "path": "Arabic_Wedding/test-*"}, {"split": "validation", "path": "Arabic_Wedding/validation-*"}]}, {"config_name": "Bahrain", "data_files": [{"split": "test", "path": "Bahrain/test-*"}, {"split": "validation", "path": "Bahrain/validation-*"}]}, {"config_name": "Comoros", "data_files": [{"split": "test", "path": "Comoros/test-*"}, {"split": "validation", "path": "Comoros/validation-*"}]}, {"config_name": "Egypt_modern", "data_files": [{"split": "test", "path": "Egypt_modern/test-*"}, {"split": "validation", "path": "Egypt_modern/validation-*"}]}, {"config_name": "InfluenceFromAncientEgypt", "data_files": [{"split": "test", "path": "InfluenceFromAncientEgypt/test-*"}, {"split": "validation", "path": "InfluenceFromAncientEgypt/validation-*"}]}, {"config_name": "InfluenceFromByzantium", "data_files": [{"split": "test", "path": "InfluenceFromByzantium/test-*"}, {"split": "validation", "path": "InfluenceFromByzantium/validation-*"}]}, {"config_name": "InfluenceFromChina", "data_files": [{"split": "test", "path": "InfluenceFromChina/test-*"}, {"split": "validation", "path": "InfluenceFromChina/validation-*"}]}, {"config_name": "InfluenceFromGreece", "data_files": [{"split": "test", "path": 
"InfluenceFromGreece/test-*"}, {"split": "validation", "path": "InfluenceFromGreece/validation-*"}]}, {"config_name": "InfluenceFromIslam", "data_files": [{"split": "test", "path": "InfluenceFromIslam/test-*"}, {"split": "validation", "path": "InfluenceFromIslam/validation-*"}]}, {"config_name": "InfluenceFromPersia", "data_files": [{"split": "test", "path": "InfluenceFromPersia/test-*"}, {"split": "validation", "path": "InfluenceFromPersia/validation-*"}]}, {"config_name": "InfluenceFromRome", "data_files": [{"split": "test", "path": "InfluenceFromRome/test-*"}, {"split": "validation", "path": "InfluenceFromRome/validation-*"}]}, {"config_name": "Iraq", "data_files": [{"split": "test", "path": "Iraq/test-*"}, {"split": "validation", "path": "Iraq/validation-*"}]}, {"config_name": "Islam_Education", "data_files": [{"split": "test", "path": "Islam_Education/test-*"}, {"split": "validation", "path": "Islam_Education/validation-*"}]}, {"config_name": "Islam_branches_and_schools", "data_files": [{"split": "test", "path": "Islam_branches_and_schools/test-*"}, {"split": "validation", "path": "Islam_branches_and_schools/validation-*"}]}, {"config_name": "Islamic_law_system", "data_files": [{"split": "test", "path": "Islamic_law_system/test-*"}, {"split": "validation", "path": "Islamic_law_system/validation-*"}]}, {"config_name": "Jordan", "data_files": [{"split": "test", "path": "Jordan/test-*"}, {"split": "validation", "path": "Jordan/validation-*"}]}, {"config_name": "Kuwait", "data_files": [{"split": "test", "path": "Kuwait/test-*"}, {"split": "validation", "path": "Kuwait/validation-*"}]}, {"config_name": "Lebanon", "data_files": [{"split": "test", "path": "Lebanon/test-*"}, {"split": "validation", "path": "Lebanon/validation-*"}]}, {"config_name": "Libya", "data_files": [{"split": "test", "path": "Libya/test-*"}, {"split": "validation", "path": "Libya/validation-*"}]}, {"config_name": "Mauritania", "data_files": [{"split": "test", "path": "Mauritania/test-*"}, {"split": "validation", "path": "Mauritania/validation-*"}]}, {"config_name": "Mesopotamia_civilization", "data_files": [{"split": "test", "path": "Mesopotamia_civilization/test-*"}, {"split": "validation", "path": "Mesopotamia_civilization/validation-*"}]}, {"config_name": "Morocco", "data_files": [{"split": "test", "path": "Morocco/test-*"}, {"split": "validation", "path": "Morocco/validation-*"}]}, {"config_name": "Oman", "data_files": [{"split": "test", "path": "Oman/test-*"}, {"split": "validation", "path": "Oman/validation-*"}]}, {"config_name": "Palestine", "data_files": [{"split": "test", "path": "Palestine/test-*"}, {"split": "validation", "path": "Palestine/validation-*"}]}, {"config_name": "Qatar", "data_files": [{"split": "test", "path": "Qatar/test-*"}, {"split": "validation", "path": "Qatar/validation-*"}]}, {"config_name": "Saudi_Arabia", "data_files": [{"split": "test", "path": "Saudi_Arabia/test-*"}, {"split": "validation", "path": "Saudi_Arabia/validation-*"}]}, {"config_name": "Somalia", "data_files": [{"split": "test", "path": "Somalia/test-*"}, {"split": "validation", "path": "Somalia/validation-*"}]}, {"config_name": "Sudan", "data_files": [{"split": "test", "path": "Sudan/test-*"}, {"split": "validation", "path": "Sudan/validation-*"}]}, {"config_name": "Syria", "data_files": [{"split": "test", "path": "Syria/test-*"}, {"split": "validation", "path": "Syria/validation-*"}]}, {"config_name": "Tunisia", "data_files": [{"split": "test", "path": "Tunisia/test-*"}, {"split": "validation", "path": 
"Tunisia/validation-*"}]}, {"config_name": "United_Arab_Emirates", "data_files": [{"split": "test", "path": "United_Arab_Emirates/test-*"}, {"split": "validation", "path": "United_Arab_Emirates/validation-*"}]}, {"config_name": "Yemen", "data_files": [{"split": "test", "path": "Yemen/test-*"}, {"split": "validation", "path": "Yemen/validation-*"}]}, {"config_name": "communication", "data_files": [{"split": "test", "path": "communication/test-*"}, {"split": "validation", "path": "communication/validation-*"}]}, {"config_name": "computer_and_phone", "data_files": [{"split": "test", "path": "computer_and_phone/test-*"}, {"split": "validation", "path": "computer_and_phone/validation-*"}]}, {"config_name": "daily_life", "data_files": [{"split": "test", "path": "daily_life/test-*"}, {"split": "validation", "path": "daily_life/validation-*"}]}, {"config_name": "entertainment", "data_files": [{"split": "test", "path": "entertainment/test-*"}, {"split": "validation", "path": "entertainment/validation-*"}]}]} | 2024-02-16T15:10:44+00:00 |
|
79cce594c3b54d6014543854a3c5c8f0d6f6488b |
# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian2.0DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian2.0DPO](https://huggingface.co/RaduGabriel/SirUkrainian2.0DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO",
"harness_winogrande_5",
split="train")
```
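The aggregated metrics live in the additional "results" configuration mentioned above. Below is a minimal sketch for loading them; the split layout is assumed to mirror the task configurations (one split per run timestamp plus a "latest" split), so inspect the returned object if it differs:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration of this details repo.
# Assumption: splits follow the same timestamp/"latest" naming as the task configs.
results = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO",
                       "results")
print(results)  # lists the available splits and their sizes
```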
## Latest results
These are the [latest results from run 2024-02-16T15:05:17.320355](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO/blob/main/results_2024-02-16T15-05-17.320355.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results files and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6121147220670339,
"acc_stderr": 0.03309664409492137,
"acc_norm": 0.61597978296096,
"acc_norm_stderr": 0.03377396962261278,
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6508219588046045,
"mc2_stderr": 0.015061453784907832
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946716,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6541525592511452,
"acc_stderr": 0.004746716805735753,
"acc_norm": 0.8351921927902808,
"acc_norm_stderr": 0.003702487662126949
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102289,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102289
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072386,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072386
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.0247843169421564,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.0247843169421564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501602,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501602
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693257,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.01471168438613996,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.01471168438613996
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879695,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879695
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48044692737430167,
"acc_stderr": 0.016709709877662,
"acc_norm": 0.48044692737430167,
"acc_norm_stderr": 0.016709709877662
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697046,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697046
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159696,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159696
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281515,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6508219588046045,
"mc2_stderr": 0.015061453784907832
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.41925701288855194,
"acc_stderr": 0.01359172095904211
}
}
```
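The same figures can also be retrieved programmatically from the results file linked above. Here is a minimal sketch using `huggingface_hub`; the filename comes from the link in this card, while the top-level key layout of the JSON is an assumption, so inspect the file if it differs:
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO",
    filename="results_2024-02-16T15-05-17.320355.json",
    repo_type="dataset",
)

with open(path) as f:
    run = json.load(f)

# Assumption: the per-task metrics shown above sit under a top-level "results" key;
# fall back to the whole document if that key is absent.
print(run.get("results", run)["all"]["acc"])
```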
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO | [
"region:us"
] | 2024-02-16T15:07:36+00:00 | {"pretty_name": "Evaluation run of RaduGabriel/SirUkrainian2.0DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian2.0DPO](https://huggingface.co/RaduGabriel/SirUkrainian2.0DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T15:05:17.320355](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0DPO/blob/main/results_2024-02-16T15-05-17.320355.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6121147220670339,\n \"acc_stderr\": 0.03309664409492137,\n \"acc_norm\": 0.61597978296096,\n \"acc_norm_stderr\": 0.03377396962261278,\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6508219588046045,\n \"mc2_stderr\": 0.015061453784907832\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946716,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6541525592511452,\n \"acc_stderr\": 0.004746716805735753,\n \"acc_norm\": 0.8351921927902808,\n \"acc_norm_stderr\": 0.003702487662126949\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.02556060472102289,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.02556060472102289\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072386,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072386\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6051282051282051,\n 
\"acc_stderr\": 0.0247843169421564,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.0247843169421564\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501602,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501602\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693257,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693257\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n \"acc_stderr\": 0.01471168438613996,\n \"acc_norm\": 0.7841634738186463,\n 
\"acc_norm_stderr\": 0.01471168438613996\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879695,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879695\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48044692737430167,\n \"acc_stderr\": 0.016709709877662,\n \"acc_norm\": 0.48044692737430167,\n \"acc_norm_stderr\": 0.016709709877662\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697046,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697046\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159696,\n \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159696\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281515,\n \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6508219588046045,\n \"mc2_stderr\": 0.015061453784907832\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41925701288855194,\n \"acc_stderr\": 0.01359172095904211\n }\n}\n```", "repo_url": "https://huggingface.co/RaduGabriel/SirUkrainian2.0DPO", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-05-17.320355.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-05-17.320355.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-05-17.320355.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-05-17.320355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-05-17.320355.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-05-17.320355.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["**/details_harness|winogrande|5_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T15-05-17.320355.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T15_05_17.320355", "path": ["results_2024-02-16T15-05-17.320355.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T15-05-17.320355.parquet"]}]}]} | 2024-02-16T15:07:59+00:00 |
85bd8f0cc8326de958f445c4f059fafe66d23044 | # Dataset Card for "finance_dataset_subset_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jan-hq/finance_dataset_subset_binarized | [
"region:us"
] | 2024-02-16T15:10:16+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 56318865.41867357, "num_examples": 55352}, {"name": "test", "num_bytes": 6258443.076858308, "num_examples": 6151}], "download_size": 46540189, "dataset_size": 62577308.49553187}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-16T15:26:09+00:00 |
ece087aca234ab8de35883f9648ac5f04176082c |
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-slimorca-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-slimorca-sft](https://huggingface.co/CorticalStack/mistral-7b-slimorca-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft",
"harness_winogrande_5",
	split="latest")
```
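
If you want the aggregated metrics rather than the per-task details, you can load the "results" configuration instead. The sketch below is illustrative only; it assumes the default `datasets` caching behaviour and uses the "latest" split, which always points to the most recent run:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
# The "results" configuration stores the run-level summary, and the
# "latest" split always points to the newest timestamped run.
results = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft",
    "results",
    split="latest",
)
print(results[0])
```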
## Latest results
These are the [latest results from run 2024-02-16T15:13:14.245418](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft/blob/main/results_2024-02-16T15-13-14.245418.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6048142926851883,
"acc_stderr": 0.032906396487846115,
"acc_norm": 0.6105167673880798,
"acc_norm_stderr": 0.03358845031709559,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.5018293862426123,
"mc2_stderr": 0.014695173813842227
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526843,
"acc_norm": 0.5853242320819113,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.6249751045608445,
"acc_stderr": 0.004831399218500236,
"acc_norm": 0.8316072495518821,
"acc_norm_stderr": 0.003734498979207306
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102288,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102288
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139746,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.024838811988033165,
"acc_norm": 0.6,
"acc_norm_stderr": 0.024838811988033165
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712997,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712997
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.04284467968052194,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.04284467968052194
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406943,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406943
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2670391061452514,
"acc_stderr": 0.01479650262256256,
"acc_norm": 0.2670391061452514,
"acc_norm_stderr": 0.01479650262256256
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630453,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630453
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567655,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567655
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.019524316744866356,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.019524316744866356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.5018293862426123,
"mc2_stderr": 0.014695173813842227
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710683
},
"harness|gsm8k|5": {
"acc": 0.32297194844579225,
"acc_stderr": 0.012880360794851825
}
}
```
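
If you prefer to work with the raw JSON file linked above, a minimal sketch (assuming the `huggingface_hub` client is installed, and that the file's layout matches the metrics shown here) is to download it directly and read the headline numbers:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft",
    filename="results_2024-02-16T15-13-14.245418.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The metrics shown above may sit at the top level or under a "results" key,
# depending on the harness version, so fall back gracefully.
metrics = data.get("results", data)
print("average acc:", metrics["all"]["acc"])
print("gsm8k acc:", metrics["harness|gsm8k|5"]["acc"])
```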
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft | [
"region:us"
] | 2024-02-16T15:15:31+00:00 | {"pretty_name": "Evaluation run of CorticalStack/mistral-7b-slimorca-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-slimorca-sft](https://huggingface.co/CorticalStack/mistral-7b-slimorca-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T15:13:14.245418](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-slimorca-sft/blob/main/results_2024-02-16T15-13-14.245418.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6048142926851883,\n \"acc_stderr\": 0.032906396487846115,\n \"acc_norm\": 0.6105167673880798,\n \"acc_norm_stderr\": 0.03358845031709559,\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5018293862426123,\n \"mc2_stderr\": 0.014695173813842227\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526843,\n \"acc_norm\": 0.5853242320819113,\n \"acc_norm_stderr\": 0.014397070564409174\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6249751045608445,\n \"acc_stderr\": 0.004831399218500236,\n \"acc_norm\": 0.8316072495518821,\n \"acc_norm_stderr\": 0.003734498979207306\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n 
\"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.02556060472102288,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.02556060472102288\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198892,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198892\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139746,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139746\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.024838811988033165,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.024838811988033165\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712997,\n \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712997\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145624,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145624\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.04284467968052194,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.04284467968052194\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406943,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406943\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 
0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n \"acc_stderr\": 0.01479650262256256,\n \"acc_norm\": 0.2670391061452514,\n \"acc_norm_stderr\": 0.01479650262256256\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630453,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630453\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n \"acc_stderr\": 0.012671902782567655,\n \"acc_norm\": 0.4380704041720991,\n \"acc_norm_stderr\": 0.012671902782567655\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866356,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866356\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5018293862426123,\n \"mc2_stderr\": 0.014695173813842227\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710683\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32297194844579225,\n \"acc_stderr\": 0.012880360794851825\n }\n}\n```", "repo_url": 
"https://huggingface.co/CorticalStack/mistral-7b-slimorca-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-13-14.245418.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-13-14.245418.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-13-14.245418.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-13-14.245418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-13-14.245418.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T15_13_14.245418", "path": ["**/details_harness|winogrande|5_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T15-13-14.245418.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T15_13_14.245418", "path": ["results_2024-02-16T15-13-14.245418.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T15-13-14.245418.parquet"]}]}]} | 2024-02-16T15:15:55+00:00 |
5e176fb0827b9762f20da5bf395486878b79ca6f | ## INTERS: Unlocking the Power of Large Language Models in Search with Instruction Tuning
**Authors**: Yutao Zhu, Peitian Zhang, Chenghao Zhang, Yifei Chen, Binyu Xie, Zhicheng Dou, Zheng Liu, and Ji-Rong Wen
⭐ Other data files and the fine-tuned models are still being uploaded. Due to network latency, this will take several days!
<p>
📃 <a href="https://arxiv.org/abs/2401.06532">ArXiv Paper</a>
•
💡 <a href="https://github.com/DaoD/INTERS">GitHub</a>
</p>
## Introduction
Large language models (LLMs) have demonstrated impressive capabilities in various natural language processing tasks. Despite this, their application to information retrieval (IR) tasks is still challenging due to the infrequent occurrence of many IR-specific concepts in natural language. While prompt-based methods can provide task descriptions to LLMs, they often fall short in facilitating a comprehensive understanding and execution of IR tasks, thereby limiting LLMs' applicability. To address this gap, in this work, we explore the potential of instruction tuning to enhance LLMs' proficiency in IR tasks. We introduce a novel instruction tuning dataset, INTERS, encompassing 20 tasks across three fundamental IR categories: query understanding, document understanding, and query-document relationship understanding. The data are derived from 43 distinct datasets with manually written templates. Our empirical results reveal that INTERS significantly boosts the performance of various publicly available LLMs, such as LLaMA, Mistral, and Phi, in IR tasks. Furthermore, we conduct extensive experiments to analyze the effects of instruction design, template diversity, few-shot demonstrations, and the volume of instructions on performance.
## File List
- train.jsonl: The training set for the in-domain evaluation scenario.
- dev-qu-du-zero-shot/*: The dev set (query understanding tasks and document understanding tasks) for the zero-shot evaluation scenario.
- test-qu-du-zero-shot/*: The test set (query understanding tasks and document understanding tasks) for the zero-shot evaluation scenario.
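All *.jsonl files are in JSON Lines format (one example per line). As a minimal sketch — the path `train.jsonl` and the field names follow the File Format section below, and the local path is an assumption you should adjust to wherever the data was downloaded — the training file can be read with plain Python:

```python
import json

# Minimal sketch: iterate over the INTERS training file (JSON Lines).
# The file name "train.jsonl" and the fields "prompt", "completion" and
# "source" are the ones described in this card; the local path is an assumption.
examples = []
with open("train.jsonl", "r", encoding="utf-8") as f:
    for line in f:
        line = line.strip()
        if not line:
            continue
        examples.append(json.loads(line))  # one instruction-tuning example per line

print(f"Loaded {len(examples)} examples")
first = examples[0]
print(first["prompt"][:200])      # the input for the LLM
print(first["completion"][:200])  # the expected output
print(first["source"])            # the source dataset
```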
## File Format
Each line of the file is a JSON dict with the following structure:
```
{
"prompt": the input for LLMs,
"completion": the output for LLMs,
"source": the data source,
}
``` | yutaozhu94/INTERS | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:cc-by-sa-4.0",
"arxiv:2401.06532",
"region:us"
] | 2024-02-16T15:16:51+00:00 | {"language": ["en"], "license": "cc-by-sa-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"]} | 2024-02-17T04:08:10+00:00 |
61e0c9b19b33539680be6c4f805901c4efca11a4 | hassu619/Hassu-5B | [
"region:us"
] | 2024-02-16T15:17:52+00:00 | {} | 2024-02-16T15:17:52+00:00 |
|
70213e99c163571724cccf6990d4fb076381bf8e | LucasPoker00/yMatheusIA | [
"region:us"
] | 2024-02-16T15:18:38+00:00 | {} | 2024-02-16T15:18:54+00:00 |
|
bc7a29346dbcaa16a8cd883b1f3e681ab2b7ff2a | OALL/Arabic_EXAMS | [
"region:us"
] | 2024-02-16T15:24:11+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 167521, "num_examples": 537}, {"name": "validation", "num_bytes": 7803, "num_examples": 25}], "download_size": 99036, "dataset_size": 175324}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-16T20:54:12+00:00 |
|
822e42d0483c20b9ec9bbc12afdbc4803874554a | Recag/Rp_CommonC_726_1 | [
"region:us"
] | 2024-02-16T15:25:43+00:00 | {} | 2024-02-16T15:30:37+00:00 |
|
0c55975784cb284f5317efbd150e626450120b5b | Recag/Rp_CommonC_726_2 | [
"region:us"
] | 2024-02-16T15:26:02+00:00 | {} | 2024-02-16T15:31:32+00:00 |
|
8da774809e40971cde9ad643d1ad8ffba8550bf8 | Recag/Rp_CommonC_726_3 | [
"region:us"
] | 2024-02-16T15:27:42+00:00 | {} | 2024-02-16T15:33:05+00:00 |
|
5e362c1416a7a7186bc0cff6758876b0ff90e2cd |
🔧 This is still under construction 🔧
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | LucasWeber/tinyMMLU | [
"region:us"
] | 2024-02-16T15:28:43+00:00 | {"dataset_info": [{"config_name": "all", "features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 161000625, "num_examples": 99842}, {"name": "test", "num_bytes": 49618.6654322746, "num_examples": 100}, {"name": "validation", "num_bytes": 763484, "num_examples": 1531}, {"name": "dev", "num_bytes": 125353, "num_examples": 285}], "download_size": 48030609, "dataset_size": 161939080.66543227}, {"config_name": "default", "features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 161000625, "num_examples": 99842}, {"name": "test", "num_bytes": 49618.6654322746, "num_examples": 100}, {"name": "validation", "num_bytes": 763484, "num_examples": 1531}, {"name": "dev", "num_bytes": 125353, "num_examples": 285}], "download_size": 48030609, "dataset_size": 161939080.66543227}], "configs": [{"config_name": "all", "data_files": [{"split": "auxiliary_train", "path": "all/auxiliary_train-*"}, {"split": "test", "path": "all/test-*"}, {"split": "validation", "path": "all/validation-*"}, {"split": "dev", "path": "all/dev-*"}]}, {"config_name": "default", "data_files": [{"split": "auxiliary_train", "path": "data/auxiliary_train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "dev", "path": "data/dev-*"}]}]} | 2024-02-16T18:47:35+00:00 |
bff8ba4d9a86891cf648d15cbff6255322eccef2 |
# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/Open_Maid_Samantha_Hermes_Orca](https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca",
"harness_winogrande_5",
split="train")
```
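The aggregated metrics live in the additional "results" configuration mentioned above. A minimal sketch for loading them — the config name comes from this card, and the "latest" split points to the most recent run (each run also has its own timestamped split):

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration of this details repo.
# "latest" always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest run
```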
## Latest results
These are the [latest results from run 2024-02-16T15:26:57.957451](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca/blob/main/results_2024-02-16T15-26-57.957451.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6479153990110535,
"acc_stderr": 0.03205375637884698,
"acc_norm": 0.6497903447396889,
"acc_norm_stderr": 0.03269663569413109,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5391206181068439,
"mc2_stderr": 0.015128579519405813
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893449,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880533
},
"harness|hellaswag|10": {
"acc": 0.6685919139613623,
"acc_stderr": 0.004697573962169426,
"acc_norm": 0.8582951603266281,
"acc_norm_stderr": 0.003480344142139517
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371544,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658537,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658537
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079064,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079064
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5391206181068439,
"mc2_stderr": 0.015128579519405813
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569572
},
"harness|gsm8k|5": {
"acc": 0.6141015921152388,
"acc_stderr": 0.013409077471319175
}
}
```
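As a small follow-up sketch, assuming you have downloaded the results file linked above and that its top-level structure matches the excerpt (the on-disk file may nest these metrics under additional keys), individual metrics can be pulled out like this:

```python
import json

# Minimal sketch: read the downloaded results file and print a few metrics.
# The file name and the keys ("all", "harness|arc:challenge|25", ...) are taken
# from the excerpt above; the real file may wrap them in extra top-level keys.
with open("results_2024-02-16T15-26-57.957451.json", "r", encoding="utf-8") as f:
    results = json.load(f)

print("Average acc_norm:", results["all"]["acc_norm"])
print("ARC-Challenge acc_norm:", results["harness|arc:challenge|25"]["acc_norm"])
print("GSM8K acc:", results["harness|gsm8k|5"]["acc"])
```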
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca | [
"region:us"
] | 2024-02-16T15:29:16+00:00 | {"pretty_name": "Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca", "dataset_summary": "Dataset automatically created during the evaluation run of model [giraffe176/Open_Maid_Samantha_Hermes_Orca](https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T15:26:57.957451](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca/blob/main/results_2024-02-16T15-26-57.957451.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6479153990110535,\n \"acc_stderr\": 0.03205375637884698,\n \"acc_norm\": 0.6497903447396889,\n \"acc_norm_stderr\": 0.03269663569413109,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5391206181068439,\n \"mc2_stderr\": 0.015128579519405813\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893449,\n \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880533\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6685919139613623,\n \"acc_stderr\": 0.004697573962169426,\n \"acc_norm\": 0.8582951603266281,\n \"acc_norm_stderr\": 0.003480344142139517\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.041307408795554966,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.041307408795554966\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n 
\"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n \"acc_stderr\": 0.015461169002371544,\n \"acc_norm\": 0.3094972067039106,\n \"acc_norm_stderr\": 0.015461169002371544\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658537,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658537\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079064,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079064\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5391206181068439,\n \"mc2_stderr\": 0.015128579519405813\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569572\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6141015921152388,\n \"acc_stderr\": 0.013409077471319175\n }\n}\n```", "repo_url": 
"https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T15_26_57.957451", "path": ["**/details_harness|winogrande|5_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T15-26-57.957451.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T15_26_57.957451", "path": ["results_2024-02-16T15-26-57.957451.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T15-26-57.957451.parquet"]}]}]} | 2024-02-16T15:29:40+00:00 |
df3cc7982dd305dd1660de79a8c0bb61b216303f | coastalcph/wiqueen | [
"region:us"
] | 2024-02-16T15:30:38+00:00 | {} | 2024-02-16T15:30:38+00:00 |
|
5c71afd8af56d7474da86d6bf82a38fa9a88f5ef | LucasPoker00/yMatheusKK | [
"license:openrail",
"region:us"
] | 2024-02-16T15:31:47+00:00 | {"license": "openrail"} | 2024-02-16T15:34:53+00:00 |
|
d385cb6b5c33ee14598aeb9a1a0fac333504f983 | Recag/Rp_CommonC_727_1 | [
"region:us"
] | 2024-02-16T15:32:13+00:00 | {} | 2024-02-16T15:38:53+00:00 |
|
802921354f5e086ae073d221015f5d19fdf2914b | Recag/Rp_CommonC_727_2 | [
"region:us"
] | 2024-02-16T15:32:37+00:00 | {} | 2024-02-16T15:40:56+00:00 |
|
2485e0880b5ec0403e767bfe6ccc3d18ede4461a | kakshak/optimoz | [
"license:mit",
"region:us"
] | 2024-02-16T15:35:21+00:00 | {"license": "mit"} | 2024-02-16T15:56:36+00:00 |
|
a3923a0893c6d8a19e95dc3feddcb63058688d29 | Recag/Rp_CommonC_727_3 | [
"region:us"
] | 2024-02-16T15:35:38+00:00 | {} | 2024-02-16T15:41:30+00:00 |
|
e25b0bfcb3eb9eb3af8dc1a77a6e4c2f18e032da | rungalileo/nli-training-data-relevance | [
"region:us"
] | 2024-02-16T15:37:37+00:00 | {"dataset_info": {"features": [{"name": "docs", "list": [{"name": "attributed", "dtype": "bool"}, {"name": "doc_level_adherence", "dtype": "int64"}, {"name": "doc_level_completeness", "dtype": "float64"}, {"name": "relevance_explanation", "dtype": "string"}, {"name": "relevant_substring", "dtype": "string"}, {"name": "relevant_substrings", "list": [{"name": "end_ix", "dtype": "int64"}, {"name": "start_ix", "dtype": "int64"}]}, {"name": "text", "dtype": "string"}, {"name": "utilized_substrings", "list": [{"name": "end_ix", "dtype": "int64"}, {"name": "start_ix", "dtype": "int64"}]}]}, {"name": "response", "dtype": "string"}, {"name": "adherent", "dtype": "bool"}, {"name": "adherence_name", "dtype": "string"}, {"name": "attribution_name", "dtype": "string"}, {"name": "source_dsm", "sequence": "string"}, {"name": "source_id", "dtype": "string"}, {"name": "source_rerun_ix", "dtype": "int64"}, {"name": "extra_annotations", "struct": [{"name": "gpt-3.5-turbo-1106_groundedness_detailed_cot_numbered_n_3", "sequence": "bool"}]}, {"name": "question", "dtype": "string"}, {"name": "completeness", "dtype": "float64"}, {"name": "completeness_explanation", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 45766742, "num_examples": 8367}, {"name": "validation", "num_bytes": 13416093, "num_examples": 2629}, {"name": "test", "num_bytes": 14046955, "num_examples": 2644}], "download_size": 32784097, "dataset_size": 73229790}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-16T15:37:46+00:00 |
|
fc5a9dd9ca7fc655f406d684813d9e105362ea59 | Recag/Rp_CommonC_728_1 | [
"region:us"
] | 2024-02-16T15:40:06+00:00 | {} | 2024-02-16T15:46:41+00:00 |
|
fcaa9a24e4b43f3b5eea256fa97f2a8000e33e3d | Abdu07/Malawi-Public-Health-Systems | [
"license:mit",
"region:us"
] | 2024-02-16T15:40:16+00:00 | {"license": "mit"} | 2024-02-16T15:41:22+00:00 |
|
b6fa43518012523a52846232f03ceab2ccfa1fc2 | Recag/Rp_CommonC_728_2 | [
"region:us"
] | 2024-02-16T15:41:37+00:00 | {} | 2024-02-16T15:45:30+00:00 |
|
cb9cef1ac6bb51d1316a416bd056ffbae99fc2c4 | maghwa/OpenHermes-2-AR-10K-23-650k-660k | [
"region:us"
] | 2024-02-16T15:42:29+00:00 | {"dataset_info": {"features": [{"name": "skip_prompt_formatting", "dtype": "null"}, {"name": "model_name", "dtype": "null"}, {"name": "model", "dtype": "null"}, {"name": "conversations", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "id", "dtype": "null"}, {"name": "avatarUrl", "dtype": "null"}, {"name": "idx", "dtype": "null"}, {"name": "language", "dtype": "null"}, {"name": "hash", "dtype": "null"}, {"name": "views", "dtype": "float64"}, {"name": "topic", "dtype": "null"}, {"name": "title", "dtype": "null"}, {"name": "category", "dtype": "null"}, {"name": "custom_instruction", "dtype": "null"}, {"name": "system_prompt", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 25296599, "num_examples": 10001}], "download_size": 11468027, "dataset_size": 25296599}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T15:42:32+00:00 |
|
419db2e512cb2bd4c88afa18ec4a45ec1e6fa834 | # Dataset Card for "human_ref_dna"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | samchain/human_ref_dna | [
"region:us"
] | 2024-02-16T15:42:39+00:00 | {"dataset_info": {"features": [{"name": "sequence", "dtype": "string"}, {"name": "start_pos", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 4086227516, "num_examples": 249221}, {"name": "test", "num_bytes": 69420664, "num_examples": 4234}], "download_size": 1534074499, "dataset_size": 4155648180}} | 2024-02-16T15:44:47+00:00 |
7f0bf422f2c606bcf7add3b8713c21266793bb54 | Abdu07/unifi-value-frameworks-pdf-lifting-competition | [
"license:mit",
"region:us"
] | 2024-02-16T15:43:19+00:00 | {"license": "mit"} | 2024-02-16T15:47:00+00:00 |
|
679484742c99a54b6d5ea1b9f0fec45de26c8295 | Recag/Rp_CommonC_728_3 | [
"region:us"
] | 2024-02-16T15:44:17+00:00 | {} | 2024-02-16T15:49:03+00:00 |
|
737f984aad733be9aa303eb6f592b681d631f282 | Recag/Rp_CommonC_729_1 | [
"region:us"
] | 2024-02-16T15:48:58+00:00 | {} | 2024-02-16T15:57:38+00:00 |
|
0e533d0a9a8ab923d20771e0c16988953efd055c | Recag/Rp_CommonC_729_2 | [
"region:us"
] | 2024-02-16T15:49:19+00:00 | {} | 2024-02-16T15:57:39+00:00 |
|
5b83ea91f858fd364f542c1679f5b3a5265f241e | Recag/Rp_CommonC_729_3 | [
"region:us"
] | 2024-02-16T15:49:34+00:00 | {} | 2024-02-16T15:52:27+00:00 |
|
199fa7bd044ca68ab38bcf5479ae58d9782fef29 | adamxyang/1.4b-policy_preference_data_gold_labelled_noisy | [
"region:us"
] | 2024-02-16T15:53:58+00:00 | {"dataset_info": {"features": [{"name": "answers", "sequence": "string"}, {"name": "input", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "preference", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 27875579, "num_examples": 49383}, {"name": "validation", "num_bytes": 1139961, "num_examples": 2000}], "download_size": 15731882, "dataset_size": 29015540}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-16T15:54:36+00:00 |
|
1a07b27ead9575ce176e6bed917e79890a90408b | zhaospei/data-50k-gen-baseline | [
"region:us"
] | 2024-02-16T16:04:13+00:00 | {} | 2024-02-16T16:06:28+00:00 |
|
b335a83cbe2aa3948c3f446660cdc708a5766ef0 | LumenAI/donut_test_dataset | [
"region:us"
] | 2024-02-16T16:04:15+00:00 | {} | 2024-02-16T16:06:38+00:00 |
|
fc65c8d79e80c521d7e9317a374b4fb81ecd1ace | avr4/taxiDataset | [
"region:us"
] | 2024-02-16T16:08:13+00:00 | {} | 2024-02-16T16:10:20+00:00 |
|
ce063edc9303324533620403fdfa3183d0aed9b8 | zhaospei/data-50k-gen-finetune | [
"region:us"
] | 2024-02-16T16:08:28+00:00 | {} | 2024-02-17T16:23:01+00:00 |
|
6bf1ae7984df4a2088692843a95ee032b44364b0 | siddeo99/jobsaidataset | [
"license:apache-2.0",
"region:us"
] | 2024-02-16T16:10:13+00:00 | {"license": "apache-2.0"} | 2024-02-16T16:10:13+00:00 |
|
58b424f38b27d41710d17378526107de9acfa318 | FVilmar/conectado_dias | [
"license:openrail",
"region:us"
] | 2024-02-16T16:10:59+00:00 | {"license": "openrail"} | 2024-02-16T16:11:20+00:00 |
|
b544e809263882f38e9aaf4c6ae875678985a35f |
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-metamathqa-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-metamathqa-sft](https://huggingface.co/CorticalStack/mistral-7b-metamathqa-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft",
"harness_winogrande_5",
split="train")
```
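
The aggregated metrics can be loaded the same way through the "results" configuration. This is a minimal sketch, assuming the "latest" split naming used by the per-task configurations also applies to this configuration:

```python
from datasets import load_dataset

# Aggregated results of the run ("results" config, "latest" split assumed)
results = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft",
    "results",
    split="latest",
)
print(results[0])
```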
## Latest results
These are the [latest results from run 2024-02-16T16:13:27.633644](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft/blob/main/results_2024-02-16T16-13-27.633644.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6109833546650115,
"acc_stderr": 0.033243551615710396,
"acc_norm": 0.6155806742206059,
"acc_norm_stderr": 0.03392554504142213,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.44733823807616424,
"mc2_stderr": 0.014684832855657028
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344083,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.01440136664121639
},
"harness|hellaswag|10": {
"acc": 0.6065524795857399,
"acc_stderr": 0.0048751626991216535,
"acc_norm": 0.8044214299940251,
"acc_norm_stderr": 0.0039583479345203345
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.025560604721022884,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.025560604721022884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03053289223393202,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03053289223393202
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608302,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608302
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399324,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399324
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352167,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352167
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459752,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459752
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371165,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371165
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520425,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520425
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.01261820406658839,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.01261820406658839
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354022,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354022
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.44733823807616424,
"mc2_stderr": 0.014684832855657028
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205191
},
"harness|gsm8k|5": {
"acc": 0.40181956027293403,
"acc_stderr": 0.013504357787494035
}
}
```
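For a quick sanity check, here is a minimal sketch of how the per-task numbers above could be aggregated. It assumes the full results JSON has been saved locally as `results.json` (the filename is illustrative, not part of this repository):

```python
import json

# Load the evaluation results shown above (assumed to be saved as results.json)
# and average the "acc" metric over all MMLU (hendrycksTest) subtasks.
with open("results.json", "r", encoding="utf-8") as f:
    results = json.load(f)

mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU subtasks: {len(mmlu_accs)}")
print(f"Unweighted mean acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```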
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft | [
"region:us"
] | 2024-02-16T16:15:47+00:00 | {"pretty_name": "Evaluation run of CorticalStack/mistral-7b-metamathqa-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-metamathqa-sft](https://huggingface.co/CorticalStack/mistral-7b-metamathqa-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T16:13:27.633644](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-metamathqa-sft/blob/main/results_2024-02-16T16-13-27.633644.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6109833546650115,\n \"acc_stderr\": 0.033243551615710396,\n \"acc_norm\": 0.6155806742206059,\n \"acc_norm_stderr\": 0.03392554504142213,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.44733823807616424,\n \"mc2_stderr\": 0.014684832855657028\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344083,\n \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.01440136664121639\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6065524795857399,\n \"acc_stderr\": 0.0048751626991216535,\n \"acc_norm\": 0.8044214299940251,\n \"acc_norm_stderr\": 0.0039583479345203345\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.025560604721022884,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.025560604721022884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608302,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608302\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399324,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399324\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352167,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352167\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459752,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459752\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7803320561941252,\n \"acc_stderr\": 0.014805384478371165,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.014805384478371165\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n \"acc_stderr\": 0.015201032512520425,\n \"acc_norm\": 0.2916201117318436,\n \"acc_norm_stderr\": 0.015201032512520425\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.01261820406658839,\n \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.01261820406658839\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354022,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354022\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.44733823807616424,\n \"mc2_stderr\": 0.014684832855657028\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205191\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \"acc_stderr\": 
0.013504357787494035\n }\n}\n```", "repo_url": "https://huggingface.co/CorticalStack/mistral-7b-metamathqa-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|arc:challenge|25_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|gsm8k|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hellaswag|10_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T16_13_27.633644", "path": ["**/details_harness|winogrande|5_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T16-13-27.633644.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T16_13_27.633644", "path": ["results_2024-02-16T16-13-27.633644.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T16-13-27.633644.parquet"]}]}]} | 2024-02-16T16:16:11+00:00 |
3f8b69fa9581e19fbcf6db713d43936364f65f0a | varijashukla/picture | [
"region:us"
] | 2024-02-16T16:22:43+00:00 | {} | 2024-02-16T17:26:54+00:00 |
|
97beb2fa4a07c13458f014771a813fb3fdbd4d3a |
# PIPPA - Personal Interaction Pairs between People and AI in ShareGpt format
**⚠️ CAUTION: PIPPA contains conversations, themes and scenarios which can be considered "not safe for work" (NSFW) and/or heavily disturbing in nature. Models trained purely with PIPPA may have the tendency to generate X-rated output. You have been warned.**
## Dataset Summary
Each record in the dataset contains two fields (a short parsing sketch follows the list):
- `system`: Contains whatever was typed in the **Description** field of the character creator on the website. It usually consists of a few sentences giving a brief overview of the character and any important details about them.
- `conversation`: The conversation between the user and the model, represented as a list of dictionaries. Each dictionary corresponds to a single utterance and contains two key-value pairs: `message`, the utterance itself, and `is_human`, which indicates whether the utterance was written by the user or generated by the LLM.
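Below is a minimal parsing sketch, assuming `pippa_deduped.jsonl` is a JSON Lines file whose records follow the schema described above (the field names are taken from that description; adjust them if the ShareGPT conversion renames the keys):

```python
import json

# Read pippa_deduped.jsonl line by line; each line is one JSON record.
# Assumed schema (from the field description above): "system" is a string,
# "conversation" is a list of {"message": str, "is_human": bool} turns.
with open("pippa_deduped.jsonl", "r", encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)
        system_prompt = record.get("system", "")
        print(f"SYSTEM: {system_prompt[:80]}")
        for turn in record.get("conversation", []):
            speaker = "USER" if turn["is_human"] else "CHARACTER"
            print(f"{speaker}: {turn['message'][:80]}")
        break  # only preview the first conversation
```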
For further information about PIPPA, please refer to the original [published paper](https://arxiv.org/abs/2308.05884) or contact the original PygmalionAI authors.
## Files
This ShareGPT conversion uses the original file (a loading sketch follows the list):
- **pippa_deduped.jsonl**: The 'cleaned' version of PIPPA, with duplicate conversations and any conversation with fewer than three turns removed from the dataset. **We recommend using this file.**
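If you prefer the 🤗 `datasets` library, a hedged loading example (the file name and split below are assumptions based on this card):

```python
from datasets import load_dataset

# Load the deduplicated file as a generic JSON Lines dataset.
ds = load_dataset(
    "json",
    data_files="pippa_deduped.jsonl",
    split="train",
)
print(ds)            # inspect the detected features
print(ds[0].keys())  # confirm the field names before training
```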
## Citation
Thanks to the https://huggingface.co/PygmalionAI community; all credit for the original dataset goes to them.
___
Any resemblance between the name of this dataset and any public persona is entirely coincidental.
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"not-for-all-audiences",
"conversational",
"roleplay",
"custom-format",
"a.",
"arxiv:2308.05884",
"region:us"
] | 2024-02-16T16:32:19+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational"], "pretty_name": "PIPPA - Personal Interaction Pairs Between People and AI", "tags": ["not-for-all-audiences", "conversational", "roleplay", "custom-format", "a."], "viewer": true} | 2024-02-16T16:50:13+00:00 |
46d51d99c2da866fd4b6eecffd60d99191896b06 | ksaito2omr/test_preupload | [
"region:us"
] | 2024-02-16T16:35:03+00:00 | {} | 2024-02-16T16:35:03+00:00 |
|
9c2505d819d307f1cea40b224ee713fbfe42cdf4 | misikoff/zillow | [
"license:mit",
"region:us"
] | 2024-02-16T16:36:27+00:00 | {"license": "mit"} | 2024-02-16T16:36:27+00:00 |
|
635ace5eefb13139c47386d9e44df21cfc3e3abf | DatadudeDev/Bible | [
"license:apache-2.0",
"region:us"
] | 2024-02-16T16:37:28+00:00 | {"license": "apache-2.0"} | 2024-02-16T16:38:28+00:00 |
|
451f3141a8e733e7c848ba668a8a21ef298bd08b | mikeion/mathconverse_pseudonyms | [
"region:us"
] | 2024-02-16T16:45:16+00:00 | {} | 2024-02-16T16:50:43+00:00 |
|
5a0f92c01106dc2cb837583a50a8bdb2c18a1d55 | siddeo99/sidtestfiverr | [
"region:us"
] | 2024-02-16T16:45:32+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 148650, "num_examples": 1051}], "download_size": 11709, "dataset_size": 148650}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T16:45:33+00:00 |
|
bc59e143e9758e76c05ef31df40b43bb4f812dc2 | Sugu-mar/Task_2Translate_english_to_thanglish | [
"region:us"
] | 2024-02-16T16:50:08+00:00 | {} | 2024-02-16T16:50:35+00:00 |
|
a8bcf3a8be24cf5ffb9ece921143fb4d11c89e71 | abideen/Copilot | [
"region:us"
] | 2024-02-16T16:53:28+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "metadata", "struct": [{"name": "file_path", "dtype": "string"}, {"name": "repo_id", "dtype": "string"}, {"name": "token_count", "dtype": "int64"}]}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 166992954.0, "num_examples": 22966}], "download_size": 61641266, "dataset_size": 166992954.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T16:53:33+00:00 |
|
a1bc634145c6c2fbc6b167266e4181bacd991156 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | vedastro-org/all-planet-data-london | [
"license:mit",
"region:us"
] | 2024-02-16T16:54:07+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "Name", "dtype": "string"}, {"name": "Time", "dtype": "string"}, {"name": "Location", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 99320, "num_examples": 1826}], "download_size": 23439, "dataset_size": 99320}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T17:28:41+00:00 |