sha (string, 40 chars) | text (string, 0–13.4M chars) | id (string, 2–117 chars) | tags (sequence) | created_at (string, 25 chars) | metadata (string, 2–31.7M chars) | last_modified (string, 25 chars)
---|---|---|---|---|---|---|
070a4f2f7bc70d2b63f714f32e44bd2e687f279f |  | senhorsapo/kocho | ["license:openrail", "region:us"] | 2024-01-14T18:46:23+00:00 | {"license": "openrail"} | 2024-01-14T18:46:46+00:00 |
efbaee1ec31f361ec40e7b75eb66ea70507ca9ce |  | DAVIX08BR/vo | ["license:openrail", "region:us"] | 2024-01-14T18:47:06+00:00 | {"license": "openrail"} | 2024-01-17T14:51:53+00:00 |
81f1b9676fac70c62caaeeee825b0548396c3456 |  | FINNUMBER/ESG_Instruction | ["region:us"] | 2024-01-14T18:56:20+00:00 | {} | 2024-02-05T08:05:12+00:00 |
374f4e323cb607b15e34d783ab924c40de4e8748 |  | ascolda/ru_en_Crystallography_and_Spectroscopy | ["task_categories:translation", "size_categories:10K<n<100K", "language:ru", "language:en", "chemistry", "region:us"] | 2024-01-14T19:02:40+00:00 | {"language": ["ru", "en"], "size_categories": ["10K<n<100K"], "task_categories": ["translation"], "tags": ["chemistry"]} | 2024-01-14T19:15:36+00:00 |
715ebd235aa83dca5e53c0d4f0b021b1316567eb |  | TheGreatP/HozierVoz | ["license:openrail", "region:us"] | 2024-01-14T19:03:29+00:00 | {"license": "openrail"} | 2024-01-14T19:06:58+00:00 |
b0013ae1bafaa183513e49bafbc0d242acffb5dc |  | jianfuzhang233/controlnet_syncdreamer | ["license:mit", "region:us"] | 2024-01-14T19:07:52+00:00 | {"license": "mit"} | 2024-01-29T02:35:00+00:00 |
654f4ce66f282ec9acbbe29c94e6cb4238e6f09a |  | GilsonRDF/Teste | ["region:us"] | 2024-01-14T19:10:23+00:00 | {"dataset_info": {"features": [{"name": "conversation", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5450.4, "num_examples": 24}, {"name": "test", "num_bytes": 1362.6, "num_examples": 6}], "download_size": 6033, "dataset_size": 6813.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T21:03:30+00:00 |
d5a0c7d41def084ffa9c6d4a7f48fe64626f4b25 |  | adambuttrick/500K-ner-indexes-multiple-organizations-locations-alpaca-format-json-response-all-cases | ["license:cc0-1.0", "region:us"] | 2024-01-14T19:12:04+00:00 | {"license": "cc0-1.0"} | 2024-01-14T19:15:32+00:00 |
e1242d51159f03f66004e612145c395116e3e854 |  | yeager89/levi | ["region:us"] | 2024-01-14T19:12:29+00:00 | {} | 2024-01-15T01:16:45+00:00 |
d1f87e29d66a29edfa8f10af2dbb940b48028e00 |  | Minata/cot_mistral_method2test_v1 | ["region:us"] | 2024-01-14T19:12:35+00:00 | {"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 155868, "num_examples": 93}], "download_size": 27946, "dataset_size": 155868}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T19:12:36+00:00 |
ff4150f5b27919f149c49ad2fb5133178647a7a8 |
# Dataset Card for Evaluation run of CallComply/openchat-3.5-0106-11b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CallComply/openchat-3.5-0106-11b](https://huggingface.co/CallComply/openchat-3.5-0106-11b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CallComply__openchat-3.5-0106-11b",
"harness_winogrande_5",
split="train")
```
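The aggregated "results" configuration mentioned above can be loaded in the same way; a minimal sketch (the "results" config name and its "latest" split are the ones listed in this card's metadata):
```python
from datasets import load_dataset

# Load the aggregated results of the run; the "latest" split points to the
# most recent evaluation (here 2024-01-14T19:16:22.396289).
results = load_dataset(
    "open-llm-leaderboard/details_CallComply__openchat-3.5-0106-11b",
    "results",
    split="latest",
)
print(results[0])  # a row holding the aggregated metrics for this run
```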
## Latest results
These are the [latest results from run 2024-01-14T19:16:22.396289](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__openchat-3.5-0106-11b/blob/main/results_2024-01-14T19-16-22.396289.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6221695918215556,
"acc_stderr": 0.032672062972624025,
"acc_norm": 0.6283243003334837,
"acc_norm_stderr": 0.033341783944514224,
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960498,
"mc2": 0.4806689432668841,
"mc2_stderr": 0.014999748207355675
},
"harness|arc:challenge|25": {
"acc": 0.5981228668941979,
"acc_stderr": 0.014327268614578276,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.5804620593507269,
"acc_stderr": 0.00492474850063935,
"acc_norm": 0.7863971320454093,
"acc_norm_stderr": 0.004090119686697031
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.02557625706125383,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.02557625706125383
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.01665927970029582,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.01665927970029582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066304,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436596,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436596
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26033519553072626,
"acc_stderr": 0.01467625200931947,
"acc_norm": 0.26033519553072626,
"acc_norm_stderr": 0.01467625200931947
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.02659678228769704,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.02659678228769704
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799802,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573695,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573695
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960498,
"mc2": 0.4806689432668841,
"mc2_stderr": 0.014999748207355675
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.34495830174374525,
"acc_stderr": 0.01309363013366622
}
}
```
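As a quick illustration of how these per-task entries can be aggregated, here is a minimal sketch that averages `acc_norm` over the `harness|hendrycksTest-*|5` subtasks; `latest_results` is assumed to hold the dictionary shown above (only two entries are copied here for brevity):
```python
# Minimal sketch: `latest_results` is assumed to hold the dictionary shown above;
# only two of the hendrycksTest entries are copied here for brevity.
latest_results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27, "acc_norm": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074, "acc_norm": 0.6074074074074074},
}

# Average acc_norm over all MMLU (hendrycksTest) subtasks present in the dict.
mmlu = {k: v for k, v in latest_results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU acc_norm over {len(mmlu)} subtasks: {mmlu_acc_norm:.4f}")
```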
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CallComply__openchat-3.5-0106-11b | [
"region:us"
] | 2024-01-14T19:18:41+00:00 | {"pretty_name": "Evaluation run of CallComply/openchat-3.5-0106-11b", "dataset_summary": "Dataset automatically created during the evaluation run of model [CallComply/openchat-3.5-0106-11b](https://huggingface.co/CallComply/openchat-3.5-0106-11b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CallComply__openchat-3.5-0106-11b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T19:16:22.396289](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__openchat-3.5-0106-11b/blob/main/results_2024-01-14T19-16-22.396289.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6221695918215556,\n \"acc_stderr\": 0.032672062972624025,\n \"acc_norm\": 0.6283243003334837,\n \"acc_norm_stderr\": 0.033341783944514224,\n \"mc1\": 0.31946144430844553,\n \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.4806689432668841,\n \"mc2_stderr\": 0.014999748207355675\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578276,\n \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5804620593507269,\n \"acc_stderr\": 0.00492474850063935,\n \"acc_norm\": 0.7863971320454093,\n \"acc_norm_stderr\": 0.004090119686697031\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4417989417989418,\n \"acc_stderr\": 0.02557625706125383,\n \"acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.02557625706125383\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029582,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029582\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 
0.013468201614066304,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066304\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436596,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436596\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26033519553072626,\n \"acc_stderr\": 0.01467625200931947,\n \"acc_norm\": 0.26033519553072626,\n \"acc_norm_stderr\": 0.01467625200931947\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.02659678228769704,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.02659678228769704\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n \"acc_stderr\": 0.012727084826799802,\n \"acc_norm\": 0.4589308996088657,\n \"acc_norm_stderr\": 0.012727084826799802\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573695,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573695\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.4806689432668841,\n \"mc2_stderr\": 0.014999748207355675\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34495830174374525,\n \"acc_stderr\": 0.01309363013366622\n }\n}\n```", "repo_url": 
"https://huggingface.co/CallComply/openchat-3.5-0106-11b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|arc:challenge|25_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|gsm8k|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hellaswag|10_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-16-22.396289.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-16-22.396289.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-16-22.396289.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T19-16-22.396289.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-16-22.396289.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-16-22.396289.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["**/details_harness|winogrande|5_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T19-16-22.396289.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T19_16_22.396289", "path": ["results_2024-01-14T19-16-22.396289.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T19-16-22.396289.parquet"]}]}]} | 2024-01-14T19:19:02+00:00 |
3ccc55a4c2278f7573dd79733945c6b49542ad5a | DucHaiten/sd1.5-journey | [
"region:us"
] | 2024-01-14T19:23:17+00:00 | {} | 2024-01-23T08:14:33+00:00 |
|
93c237afec08a4ef1e295f5089ed3ca0cf23376b |
# Dataset Card for Evaluation run of AA051611/A0113
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/A0113](https://huggingface.co/AA051611/A0113) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__A0113",
"harness_winogrande_5",
split="train")
```
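To see which of the 63 configurations are available before picking one, the `datasets` library provides `get_dataset_config_names`. The snippet below is a minimal sketch assuming public access to this repository; the configuration names mirror the entries listed in the dataset metadata.
```python
from datasets import get_dataset_config_names

# List every configuration of this details dataset
# (one per evaluated task, plus the aggregated "results" configuration).
configs = get_dataset_config_names("open-llm-leaderboard/details_AA051611__A0113")
print(len(configs), "configurations available")
print(configs[:5])
```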
## Latest results
These are the [latest results from run 2024-01-14T19:22:00.115237](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0113/blob/main/results_2024-01-14T19-22-00.115237.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7396629430618338,
"acc_stderr": 0.02895723757690259,
"acc_norm": 0.7443509721070339,
"acc_norm_stderr": 0.02950325667268791,
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5965256915069256,
"mc2_stderr": 0.01518941143132932
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042194,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205761
},
"harness|hellaswag|10": {
"acc": 0.6549492133041227,
"acc_stderr": 0.00474413282539152,
"acc_norm": 0.848635729934276,
"acc_norm_stderr": 0.0035767110656195833
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930387,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.02461829819586651,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02461829819586651
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5,
"acc_stderr": 0.04975185951049946,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04975185951049946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7659574468085106,
"acc_stderr": 0.027678452578212383,
"acc_norm": 0.7659574468085106,
"acc_norm_stderr": 0.027678452578212383
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.867741935483871,
"acc_stderr": 0.019272015434846478,
"acc_norm": 0.867741935483871,
"acc_norm_stderr": 0.019272015434846478
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5812807881773399,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.5812807881773399,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284357,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.022390787638216773,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.022390787638216773
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.01521676181926258,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.01521676181926258
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246784,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246784
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4111111111111111,
"acc_stderr": 0.029999923508706682,
"acc_norm": 0.4111111111111111,
"acc_norm_stderr": 0.029999923508706682
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.022448264476832583,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.022448264476832583
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9064220183486239,
"acc_stderr": 0.012486841824601963,
"acc_norm": 0.9064220183486239,
"acc_norm_stderr": 0.012486841824601963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969426994,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969426994
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065515,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065515
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597453,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597453
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9074074074074074,
"acc_stderr": 0.02802188803860943,
"acc_norm": 0.9074074074074074,
"acc_norm_stderr": 0.02802188803860943
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.046161430750285455,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.046161430750285455
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331356,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625852,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625852
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9029374201787995,
"acc_stderr": 0.010586474712018302,
"acc_norm": 0.9029374201787995,
"acc_norm_stderr": 0.010586474712018302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7861271676300579,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.7861271676300579,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6547486033519553,
"acc_stderr": 0.015901432608930358,
"acc_norm": 0.6547486033519553,
"acc_norm_stderr": 0.015901432608930358
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.022140767512880973,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.022140767512880973
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.0221224397724808,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.0221224397724808
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.02118589361522515,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.02118589361522515
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5851063829787234,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.5851063829787234,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5619295958279009,
"acc_stderr": 0.012671902782567643,
"acc_norm": 0.5619295958279009,
"acc_norm_stderr": 0.012671902782567643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.023345163616544855,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.023345163616544855
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7761437908496732,
"acc_stderr": 0.016863008585416613,
"acc_norm": 0.7761437908496732,
"acc_norm_stderr": 0.016863008585416613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.025206963154225402,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.025206963154225402
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534094,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594173,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594173
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5965256915069256,
"mc2_stderr": 0.01518941143132932
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.6087945413191812,
"acc_stderr": 0.0134425024027943
}
}
```
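To retrieve these aggregated numbers programmatically rather than reading the JSON above, you can load the "results" configuration. The sketch below assumes the "latest" split listed in the dataset metadata, which always points to the most recent evaluation run.
```python
from datasets import load_dataset

# Load the aggregated results for this model; the "latest" split tracks
# the most recent evaluation run recorded in this dataset.
results = load_dataset(
    "open-llm-leaderboard/details_AA051611__A0113",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest run
```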
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051611__A0113 | [
"region:us"
] | 2024-01-14T19:24:10+00:00 | {"pretty_name": "Evaluation run of AA051611/A0113", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051611/A0113](https://huggingface.co/AA051611/A0113) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__A0113\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T19:22:00.115237](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0113/blob/main/results_2024-01-14T19-22-00.115237.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7396629430618338,\n \"acc_stderr\": 0.02895723757690259,\n \"acc_norm\": 0.7443509721070339,\n \"acc_norm_stderr\": 0.02950325667268791,\n \"mc1\": 0.412484700122399,\n \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5965256915069256,\n \"mc2_stderr\": 0.01518941143132932\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6549492133041227,\n \"acc_stderr\": 0.00474413282539152,\n \"acc_norm\": 0.848635729934276,\n \"acc_norm_stderr\": 0.0035767110656195833\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930387,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02461829819586651,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02461829819586651\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7659574468085106,\n \"acc_stderr\": 0.027678452578212383,\n \"acc_norm\": 0.7659574468085106,\n \"acc_norm_stderr\": 0.027678452578212383\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774632,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774632\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.024677862841332783,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.024677862841332783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.867741935483871,\n \"acc_stderr\": 0.019272015434846478,\n \"acc_norm\": 0.867741935483871,\n \"acc_norm_stderr\": 0.019272015434846478\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5812807881773399,\n \"acc_stderr\": 0.03471192860518468,\n \"acc_norm\": 0.5812807881773399,\n \"acc_norm_stderr\": 0.03471192860518468\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.022390787638216773,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.022390787638216773\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.01521676181926258,\n \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.01521676181926258\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246784,\n \"acc_norm\": 0.7923076923076923,\n 
\"acc_norm_stderr\": 0.020567539567246784\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4111111111111111,\n \"acc_stderr\": 0.029999923508706682,\n \"acc_norm\": 0.4111111111111111,\n \"acc_norm_stderr\": 0.029999923508706682\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.022448264476832583,\n \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.022448264476832583\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9064220183486239,\n \"acc_stderr\": 0.012486841824601963,\n \"acc_norm\": 0.9064220183486239,\n \"acc_norm_stderr\": 0.012486841824601963\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426994,\n \"acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426994\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065515,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065515\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597453\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9074074074074074,\n \"acc_stderr\": 0.02802188803860943,\n \"acc_norm\": 0.9074074074074074,\n \"acc_norm_stderr\": 0.02802188803860943\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.046161430750285455,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.046161430750285455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331356,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331356\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.018315891685625852,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.018315891685625852\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9029374201787995,\n \"acc_stderr\": 0.010586474712018302,\n \"acc_norm\": 0.9029374201787995,\n \"acc_norm_stderr\": 0.010586474712018302\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7861271676300579,\n \"acc_stderr\": 0.022075709251757177,\n \"acc_norm\": 0.7861271676300579,\n \"acc_norm_stderr\": 0.022075709251757177\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6547486033519553,\n \"acc_stderr\": 0.015901432608930358,\n \"acc_norm\": 0.6547486033519553,\n \"acc_norm_stderr\": 0.015901432608930358\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.022140767512880973,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.022140767512880973\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n \"acc_stderr\": 0.0221224397724808,\n \"acc_norm\": 0.8135048231511254,\n \"acc_norm_stderr\": 0.0221224397724808\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.02118589361522515,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.02118589361522515\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5851063829787234,\n \"acc_stderr\": 0.0293922365846125,\n \"acc_norm\": 0.5851063829787234,\n \"acc_norm_stderr\": 0.0293922365846125\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5619295958279009,\n \"acc_stderr\": 0.012671902782567643,\n \"acc_norm\": 0.5619295958279009,\n \"acc_norm_stderr\": 0.012671902782567643\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.023345163616544855,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.023345163616544855\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7761437908496732,\n \"acc_stderr\": 0.016863008585416613,\n \"acc_norm\": 0.7761437908496732,\n \"acc_norm_stderr\": 0.016863008585416613\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.025206963154225402,\n \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.025206963154225402\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n \"acc_stderr\": 0.020687186951534094,\n \"acc_norm\": 0.9054726368159204,\n \"acc_norm_stderr\": 0.020687186951534094\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594173,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594173\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5965256915069256,\n \"mc2_stderr\": 0.01518941143132932\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6087945413191812,\n \"acc_stderr\": 0.0134425024027943\n }\n}\n```", "repo_url": "https://huggingface.co/AA051611/A0113", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|arc:challenge|25_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|gsm8k|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hellaswag|10_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-22-00.115237.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-22-00.115237.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-22-00.115237.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T19-22-00.115237.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-22-00.115237.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["**/details_harness|winogrande|5_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T19-22-00.115237.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T19_22_00.115237", "path": ["results_2024-01-14T19-22-00.115237.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T19-22-00.115237.parquet"]}]}]} | 2024-01-14T19:24:32+00:00 |
1399054f245ffb967c40e9b469932f702503860b |
The text of all the articles from Logic Magazine issues 1-18.
**logic_raw.txt** - The articles are separated by three newlines. Each paragraph is on its own line.
**logic_passages.txt** - The articles, broken up into passages of between 300 and 2000 characters. Each passage is on its own line. A minimal parsing sketch for these files follows below. | bentarnoff/logic_magazine_raw | [
"language:en",
"license:cc",
"magazine",
"region:us"
] | 2024-01-14T19:29:04+00:00 | {"language": ["en"], "license": "cc", "pretty_name": "Logic Magazine Article Text", "tags": ["magazine"]} | 2024-01-15T02:16:29+00:00 |
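For quick experiments, the two text files can be split back into articles, paragraphs, and passages with plain string operations. The snippet below is a minimal sketch, assuming both files have been downloaded into the working directory and follow the layout described above; the file names and paths are taken from that description, not from any loader shipped with the dataset.

```python
from pathlib import Path

# Minimal sketch (assumes logic_raw.txt and logic_passages.txt are in the working directory).
raw = Path("logic_raw.txt").read_text(encoding="utf-8")

# Articles are separated by three newlines; each paragraph is on its own line.
articles = [a for a in raw.split("\n\n\n") if a.strip()]
paragraphs = [p for a in articles for p in a.splitlines() if p.strip()]

# Passages are simply one per line in logic_passages.txt.
passages = [
    line for line in Path("logic_passages.txt").read_text(encoding="utf-8").splitlines()
    if line.strip()
]

print(f"{len(articles)} articles, {len(paragraphs)} paragraphs, {len(passages)} passages")
```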
88d78dd044a265dd77130111289fb5555cc6f084 |
# Dataset Card for Evaluation run of CallComply/openchat-3.5-0106-128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CallComply/openchat-3.5-0106-128k](https://huggingface.co/CallComply/openchat-3.5-0106-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CallComply__openchat-3.5-0106-128k",
"harness_winogrande_5",
split="train")
```
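The aggregated "results" configuration can be loaded the same way. The snippet below is a minimal sketch (assuming the `datasets` and `pandas` packages are installed) that pulls the latest aggregated run and converts it to a DataFrame for quick inspection.

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration at its "latest" split
# and turn it into a pandas DataFrame to inspect the aggregated metrics.
results = load_dataset("open-llm-leaderboard/details_CallComply__openchat-3.5-0106-128k",
	"results",
	split="latest")
df = results.to_pandas()
print(df.columns)
```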
## Latest results
These are the [latest results from run 2024-01-14T19:33:38.391321](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__openchat-3.5-0106-128k/blob/main/results_2024-01-14T19-33-38.391321.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5749023148549777,
"acc_stderr": 0.03362057109614855,
"acc_norm": 0.5803055801198537,
"acc_norm_stderr": 0.034322339538364395,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.46500466840014487,
"mc2_stderr": 0.014848695472788285
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.014409825518403079,
"acc_norm": 0.6424914675767918,
"acc_norm_stderr": 0.014005494275916573
},
"harness|hellaswag|10": {
"acc": 0.5573590918143796,
"acc_stderr": 0.004956839256162732,
"acc_norm": 0.7730531766580363,
"acc_norm_stderr": 0.004180018992862959
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798328,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798328
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411887,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411887
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.0390369864774844,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.0390369864774844
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098616,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098616
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790236,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790236
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.0341078533890472,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.0341078533890472
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753378,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753378
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.014897235229450708,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.014897235229450708
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531015,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531015
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369922,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369922
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515962,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515962
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804015,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804015
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.026406145973625676,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.026406145973625676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236837,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236837
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3970013037809648,
"acc_stderr": 0.012496346982909556,
"acc_norm": 0.3970013037809648,
"acc_norm_stderr": 0.012496346982909556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.01992211568278668,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.01992211568278668
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.03038726291954773,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.03038726291954773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.46500466840014487,
"mc2_stderr": 0.014848695472788285
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.0117056975652052
},
"harness|gsm8k|5": {
"acc": 0.3297952994692949,
"acc_stderr": 0.012949955030571147
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CallComply__openchat-3.5-0106-128k | [
"region:us"
] | 2024-01-14T19:30:22+00:00 | {"pretty_name": "Evaluation run of CallComply/openchat-3.5-0106-128k", "dataset_summary": "Dataset automatically created during the evaluation run of model [CallComply/openchat-3.5-0106-128k](https://huggingface.co/CallComply/openchat-3.5-0106-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CallComply__openchat-3.5-0106-128k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T19:33:38.391321](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__openchat-3.5-0106-128k/blob/main/results_2024-01-14T19-33-38.391321.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5749023148549777,\n \"acc_stderr\": 0.03362057109614855,\n \"acc_norm\": 0.5803055801198537,\n \"acc_norm_stderr\": 0.034322339538364395,\n \"mc1\": 0.31334149326805383,\n \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.46500466840014487,\n \"mc2_stderr\": 0.014848695472788285\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403079,\n \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916573\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5573590918143796,\n \"acc_stderr\": 0.004956839256162732,\n \"acc_norm\": 0.7730531766580363,\n \"acc_norm_stderr\": 0.004180018992862959\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798328,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798328\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.6319444444444444,\n \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n \"acc_stderr\": 0.025988500792411887,\n \"acc_norm\": 0.7032258064516129,\n \"acc_norm_stderr\": 0.025988500792411887\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.0390369864774844,\n \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.0390369864774844\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098616,\n \"acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098616\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790236,\n \"acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790236\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.0341078533890472,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.0341078533890472\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753378,\n \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753378\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.776500638569604,\n \"acc_stderr\": 0.014897235229450708,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.014897235229450708\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531015,\n \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531015\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n \"acc_stderr\": 0.014551553659369922,\n \"acc_norm\": 0.2536312849162011,\n \"acc_norm_stderr\": 0.014551553659369922\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515962,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515962\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n \"acc_stderr\": 0.027264297599804015,\n \"acc_norm\": 0.639871382636656,\n \"acc_norm_stderr\": 0.027264297599804015\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.026406145973625676,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.026406145973625676\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3970013037809648,\n \"acc_stderr\": 0.012496346982909556,\n \"acc_norm\": 0.3970013037809648,\n \"acc_norm_stderr\": 0.012496346982909556\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5866013071895425,\n \"acc_stderr\": 0.01992211568278668,\n \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.01992211568278668\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.03038726291954773,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.03038726291954773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.7611940298507462,\n \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.46500466840014487,\n \"mc2_stderr\": 0.014848695472788285\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.0117056975652052\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3297952994692949,\n \"acc_stderr\": 0.012949955030571147\n 
}\n}\n```", "repo_url": "https://huggingface.co/CallComply/openchat-3.5-0106-128k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|arc:challenge|25_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|arc:challenge|25_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|gsm8k|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|gsm8k|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hellaswag|10_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hellaswag|10_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-28-00.282158.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T19-28-00.282158.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-33-38.391321.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-33-38.391321.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-33-38.391321.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T19-33-38.391321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-28-00.282158.parquet"]}, 
{"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["**/details_harness|winogrande|5_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": ["**/details_harness|winogrande|5_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T19-33-38.391321.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T19_28_00.282158", "path": ["results_2024-01-14T19-28-00.282158.parquet"]}, {"split": "2024_01_14T19_33_38.391321", "path": 
["results_2024-01-14T19-33-38.391321.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T19-33-38.391321.parquet"]}]}]} | 2024-01-14T19:35:58+00:00 |
b02fd8ab55d73d3eb7fea2f0e47fb53566021453 | marmofayezi/M3CelebA-Test | [
"region:us"
] | 2024-01-14T19:31:14+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "image", "dtype": "image"}, {"name": "mask", "dtype": "image"}, {"name": "caption", "dtype": "string"}, {"name": "landmark", "dtype": "image"}, {"name": "caption_fre", "dtype": "string"}, {"name": "caption_deu", "dtype": "string"}, {"name": "caption_ita", "dtype": "string"}, {"name": "caption_spa", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1104063693.75, "num_examples": 2998}], "download_size": 725132925, "dataset_size": 1104063693.75}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-18T20:00:07+00:00 |
|
e660e13f4babccf7d10073dc4e78032cca69d3a8 | dawidkubicki/ner_crypto_news | [
"region:us"
] | 2024-01-14T19:43:15+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "ner_tags", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 62388, "num_examples": 152}, {"name": "validation", "num_bytes": 13265, "num_examples": 32}, {"name": "test", "num_bytes": 14322, "num_examples": 34}], "download_size": 36662, "dataset_size": 89975}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T19:47:36+00:00 |
|
e2aa3f30d138a7891a55bc16fb25bf12ea0d2f7b |
# Dataset Card for Evaluation run of CallComply/zephyr-7b-beta-128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CallComply/zephyr-7b-beta-128k](https://huggingface.co/CallComply/zephyr-7b-beta-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T19:45:35.717294](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k/blob/main/results_2024-01-14T19-45-35.717294.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5337384150834084,
"acc_stderr": 0.034377622578911936,
"acc_norm": 0.5411488270607204,
"acc_norm_stderr": 0.03515985681109475,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144915,
"mc2": 0.4609603387456776,
"mc2_stderr": 0.01568400425776764
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.01455594976049644,
"acc_norm": 0.5827645051194539,
"acc_norm_stderr": 0.014409825518403084
},
"harness|hellaswag|10": {
"acc": 0.6016729735112527,
"acc_stderr": 0.004885529674958333,
"acc_norm": 0.8099980083648676,
"acc_norm_stderr": 0.003915007231962104
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958217,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958217
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.02375292871211213,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.02375292871211213
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178263,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178263
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.02506909438729653,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.02506909438729653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608456,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.0314506860074486,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.0314506860074486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.03314190222110657,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.03314190222110657
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.01593668106262856,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.01593668106262856
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.026454578146931494,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.026454578146931494
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.01473692638376197,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.01473692638376197
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.028431095444176643,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.028431095444176643
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5617283950617284,
"acc_stderr": 0.02760791408740047,
"acc_norm": 0.5617283950617284,
"acc_norm_stderr": 0.02760791408740047
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3820078226857888,
"acc_stderr": 0.012409564470235562,
"acc_norm": 0.3820078226857888,
"acc_norm_stderr": 0.012409564470235562
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5506535947712419,
"acc_stderr": 0.02012376652802727,
"acc_norm": 0.5506535947712419,
"acc_norm_stderr": 0.02012376652802727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235926,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235926
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.0344578996436275,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.0344578996436275
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144915,
"mc2": 0.4609603387456776,
"mc2_stderr": 0.01568400425776764
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
},
"harness|gsm8k|5": {
"acc": 0.13040181956027294,
"acc_stderr": 0.009275630324554092
}
}
```
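To work with the aggregated metrics directly rather than the per-task details, the snippet below is a minimal sketch; it assumes the aggregated numbers are exposed under the "results" configuration with a "latest" split, as described above.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "latest" split mirrors
# the newest timestamped split of the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k",
	"results",
	split="latest")
```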
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k | [
"region:us"
] | 2024-01-14T19:47:57+00:00 | {"pretty_name": "Evaluation run of CallComply/zephyr-7b-beta-128k", "dataset_summary": "Dataset automatically created during the evaluation run of model [CallComply/zephyr-7b-beta-128k](https://huggingface.co/CallComply/zephyr-7b-beta-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T19:45:35.717294](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k/blob/main/results_2024-01-14T19-45-35.717294.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5337384150834084,\n \"acc_stderr\": 0.034377622578911936,\n \"acc_norm\": 0.5411488270607204,\n \"acc_norm_stderr\": 0.03515985681109475,\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144915,\n \"mc2\": 0.4609603387456776,\n \"mc2_stderr\": 0.01568400425776764\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.01455594976049644,\n \"acc_norm\": 0.5827645051194539,\n \"acc_norm_stderr\": 0.014409825518403084\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6016729735112527,\n \"acc_stderr\": 0.004885529674958333,\n \"acc_norm\": 0.8099980083648676,\n \"acc_norm_stderr\": 0.003915007231962104\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958217,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958217\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.041307408795554966,\n \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.041307408795554966\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30687830687830686,\n \"acc_stderr\": 0.02375292871211213,\n \"acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.02375292871211213\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n \"acc_stderr\": 0.02704574657353433,\n \"acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.02704574657353433\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756776,\n \"acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756776\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178263,\n \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178263\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.02506909438729653,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729653\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608456,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608456\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.034411900234824655,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.034411900234824655\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6286919831223629,\n \"acc_stderr\": 0.0314506860074486,\n \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.0314506860074486\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n \"acc_stderr\": 0.03314190222110657,\n \"acc_norm\": 0.57847533632287,\n \"acc_norm_stderr\": 0.03314190222110657\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.7863247863247863,\n \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7266922094508301,\n \"acc_stderr\": 0.01593668106262856,\n \"acc_norm\": 0.7266922094508301,\n \"acc_norm_stderr\": 0.01593668106262856\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931494,\n \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931494\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.01473692638376197,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.01473692638376197\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176643,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176643\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5617283950617284,\n \"acc_stderr\": 0.02760791408740047,\n \"acc_norm\": 0.5617283950617284,\n \"acc_norm_stderr\": 0.02760791408740047\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3820078226857888,\n \"acc_stderr\": 0.012409564470235562,\n \"acc_norm\": 0.3820078226857888,\n \"acc_norm_stderr\": 0.012409564470235562\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.03018753206032938,\n \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.03018753206032938\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5506535947712419,\n \"acc_stderr\": 0.02012376652802727,\n \"acc_norm\": 0.5506535947712419,\n \"acc_norm_stderr\": 0.02012376652802727\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235926,\n \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235926\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n \"acc_stderr\": 0.0344578996436275,\n \"acc_norm\": 0.6119402985074627,\n \"acc_norm_stderr\": 0.0344578996436275\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144915,\n \"mc2\": 0.4609603387456776,\n \"mc2_stderr\": 0.01568400425776764\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13040181956027294,\n \"acc_stderr\": 0.009275630324554092\n 
}\n}\n```", "repo_url": "https://huggingface.co/CallComply/zephyr-7b-beta-128k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|arc:challenge|25_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|gsm8k|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hellaswag|10_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-45-35.717294.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-45-35.717294.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-45-35.717294.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T19-45-35.717294.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-45-35.717294.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T19_45_35.717294", "path": ["**/details_harness|winogrande|5_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T19-45-35.717294.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T19_45_35.717294", "path": ["results_2024-01-14T19-45-35.717294.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T19-45-35.717294.parquet"]}]}]} | 2024-01-14T19:48:19+00:00 |
fca8c06cfcbac7bb917aa8872b82aa513b22ead0 | reidolichess15/louise | [
"region:us"
] | 2024-01-14T19:50:21+00:00 | {} | 2024-01-14T19:51:33+00:00 |
|
b25ad59ffbaac297db33a4029c79c9c33177291a | Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on GitHub: [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; however, for a full explanation it is better to go to GitHub: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (as techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy from 0.718 to 0.736 on the validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
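To make the schema-pruning idea above more concrete, here is a minimal illustrative sketch. The Spider-style schema dictionary, the helper name `prune_schema`, and the simple token-matching heuristic are assumptions for illustration only; the actual mRAT-SQL-FIT pipeline on GitHub implements pruning differently.

```python
import re

def prune_schema(schema: dict[str, list[str]], sql_query: str) -> dict[str, list[str]]:
    """Drop tables and columns that never appear in the query of interest.

    This is only a rough sketch of the idea: keep a table if the table name or
    any of its columns is mentioned in the gold SQL query, and keep only the
    mentioned columns, so the concatenated question + schema input stays short
    enough to fit within the 512-token transformer limit.
    """
    tokens = set(re.findall(r"[A-Za-z_][A-Za-z0-9_]*", sql_query.lower()))
    pruned = {}
    for table, columns in schema.items():
        kept_cols = [c for c in columns if c.lower() in tokens]
        if table.lower() in tokens or kept_cols:
            pruned[table] = kept_cols or columns
    return pruned

# Example with a hypothetical two-table schema: only the referenced
# table/columns survive the pruning step.
schema = {"singer": ["singer_id", "name", "country"], "concert": ["concert_id", "year"]}
print(prune_schema(schema, "SELECT name FROM singer WHERE country = 'France'"))
# -> {'singer': ['name', 'country']}
```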
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35); [here is the pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-FIT-en-enr-enb | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T19:52:47+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:36:55+00:00 |
db20b69f991431d206c37a391464900406b8805f | Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on GitHub: [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; however, for a full explanation it is better to go to GitHub: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (as techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy from 0.718 to 0.736 on the validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35); [here is the pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-en-enr-enb | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T19:58:28+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:37:21+00:00 |
c782d04050ba6885a9d1806192057e1cdecc9f80 |
# Dataset Card for Evaluation run of moreh/MoMo-70B-lora-1.8.5-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [moreh/MoMo-70B-lora-1.8.5-DPO](https://huggingface.co/moreh/MoMo-70B-lora-1.8.5-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_moreh__MoMo-70B-lora-1.8.5-DPO",
"harness_winogrande_5",
split="train")
```
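The aggregated metrics mentioned above live in the "results" configuration and can be loaded the same way. A minimal sketch, assuming this run follows the usual config and split naming of the Open LLM Leaderboard details datasets (config "results", split "latest"):

```python
from datasets import load_dataset

# Aggregated results of the latest run; the config name "results" and the
# split name "latest" are assumed from the standard leaderboard layout.
results = load_dataset("open-llm-leaderboard/details_moreh__MoMo-70B-lora-1.8.5-DPO",
	"results",
	split="latest")
```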
## Latest results
These are the [latest results from run 2024-01-14T20:00:36.558108](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-70B-lora-1.8.5-DPO/blob/main/results_2024-01-14T20-00-36.558108.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7718244861304054,
"acc_stderr": 0.02796487785418919,
"acc_norm": 0.7749239423331258,
"acc_norm_stderr": 0.0285082622909065,
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6579360053724295,
"mc2_stderr": 0.014740925357615238
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205761,
"acc_norm": 0.6953924914675768,
"acc_norm_stderr": 0.013449522109932487
},
"harness|hellaswag|10": {
"acc": 0.6640111531567416,
"acc_stderr": 0.0047136966941316765,
"acc_norm": 0.8560047799243179,
"acc_norm_stderr": 0.00350367366880503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8301886792452831,
"acc_stderr": 0.02310839379984133,
"acc_norm": 0.8301886792452831,
"acc_norm_stderr": 0.02310839379984133
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9236111111111112,
"acc_stderr": 0.02221220393834591,
"acc_norm": 0.9236111111111112,
"acc_norm_stderr": 0.02221220393834591
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.031862098516411454,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.031862098516411454
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6228070175438597,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.6228070175438597,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8068965517241379,
"acc_stderr": 0.032894455221273995,
"acc_norm": 0.8068965517241379,
"acc_norm_stderr": 0.032894455221273995
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6825396825396826,
"acc_stderr": 0.023973861998992086,
"acc_norm": 0.6825396825396826,
"acc_norm_stderr": 0.023973861998992086
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8903225806451613,
"acc_stderr": 0.017776778700485173,
"acc_norm": 0.8903225806451613,
"acc_norm_stderr": 0.017776778700485173
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.645320197044335,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.645320197044335,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.017646526677233335,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.017646526677233335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909046,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909046
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.019671632413100288,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.019671632413100288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03046462171889533,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03046462171889533
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.02327425589870794,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.02327425589870794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5827814569536424,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.5827814569536424,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9229357798165138,
"acc_stderr": 0.011434381698911096,
"acc_norm": 0.9229357798165138,
"acc_norm_stderr": 0.011434381698911096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03114144782353605,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03114144782353605
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956715,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956715
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8931297709923665,
"acc_stderr": 0.027096548624883733,
"acc_norm": 0.8931297709923665,
"acc_norm_stderr": 0.027096548624883733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622804,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622804
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.0334327006286962,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.0334327006286962
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331362,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331362
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.01604626163167314,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.01604626163167314
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9182630906768838,
"acc_stderr": 0.009796913952313168,
"acc_norm": 0.9182630906768838,
"acc_norm_stderr": 0.009796913952313168
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.019829299214925416,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.019829299214925416
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7016759776536313,
"acc_stderr": 0.01530184004512928,
"acc_norm": 0.7016759776536313,
"acc_norm_stderr": 0.01530184004512928
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.0211706230112135,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.0211706230112135
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8488745980707395,
"acc_stderr": 0.020342749744428634,
"acc_norm": 0.8488745980707395,
"acc_norm_stderr": 0.020342749744428634
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.018303868806891787,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.018303868806891787
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6524822695035462,
"acc_stderr": 0.02840662780959095,
"acc_norm": 0.6524822695035462,
"acc_norm_stderr": 0.02840662780959095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6166883963494133,
"acc_stderr": 0.012417603662901188,
"acc_norm": 0.6166883963494133,
"acc_norm_stderr": 0.012417603662901188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.022368672562886747,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.022368672562886747
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273337,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273337
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650153,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824667,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824667
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6579360053724295,
"mc2_stderr": 0.014740925357615238
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028228
},
"harness|gsm8k|5": {
"acc": 0.7429871114480667,
"acc_stderr": 0.01203678175742868
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_moreh__MoMo-70B-lora-1.8.5-DPO | [
"region:us"
] | 2024-01-14T20:02:44+00:00 | {"pretty_name": "Evaluation run of moreh/MoMo-70B-lora-1.8.5-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [moreh/MoMo-70B-lora-1.8.5-DPO](https://huggingface.co/moreh/MoMo-70B-lora-1.8.5-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_moreh__MoMo-70B-lora-1.8.5-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T20:00:36.558108](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-70B-lora-1.8.5-DPO/blob/main/results_2024-01-14T20-00-36.558108.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7718244861304054,\n \"acc_stderr\": 0.02796487785418919,\n \"acc_norm\": 0.7749239423331258,\n \"acc_norm_stderr\": 0.0285082622909065,\n \"mc1\": 0.48959608323133413,\n \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6579360053724295,\n \"mc2_stderr\": 0.014740925357615238\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205761,\n \"acc_norm\": 0.6953924914675768,\n \"acc_norm_stderr\": 0.013449522109932487\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6640111531567416,\n \"acc_stderr\": 0.0047136966941316765,\n \"acc_norm\": 0.8560047799243179,\n \"acc_norm_stderr\": 0.00350367366880503\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8301886792452831,\n \"acc_stderr\": 0.02310839379984133,\n \"acc_norm\": 0.8301886792452831,\n \"acc_norm_stderr\": 0.02310839379984133\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9236111111111112,\n \"acc_stderr\": 0.02221220393834591,\n \"acc_norm\": 0.9236111111111112,\n \"acc_norm_stderr\": 0.02221220393834591\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 
0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.031862098516411454,\n \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.031862098516411454\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387536,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387536\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8068965517241379,\n \"acc_stderr\": 0.032894455221273995,\n \"acc_norm\": 0.8068965517241379,\n \"acc_norm_stderr\": 0.032894455221273995\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6825396825396826,\n \"acc_stderr\": 0.023973861998992086,\n \"acc_norm\": 0.6825396825396826,\n \"acc_norm_stderr\": 0.023973861998992086\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8903225806451613,\n \"acc_stderr\": 0.017776778700485173,\n \"acc_norm\": 0.8903225806451613,\n \"acc_norm_stderr\": 0.017776778700485173\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.0336612448905145,\n \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.0336612448905145\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.017646526677233335,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.017646526677233335\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909046,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909046\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.019671632413100288,\n \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.019671632413100288\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03046462171889533,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03046462171889533\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.02327425589870794,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.02327425589870794\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5827814569536424,\n \"acc_stderr\": 0.040261414976346104,\n \"acc_norm\": 0.5827814569536424,\n \"acc_norm_stderr\": 0.040261414976346104\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9229357798165138,\n \"acc_stderr\": 0.011434381698911096,\n \"acc_norm\": 0.9229357798165138,\n \"acc_norm_stderr\": 0.011434381698911096\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.03114144782353605,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.03114144782353605\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.018889750550956715,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.018889750550956715\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622804,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622804\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.0334327006286962,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.0334327006286962\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331362,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331362\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.01604626163167314,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.01604626163167314\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9182630906768838,\n \"acc_stderr\": 0.009796913952313168,\n \"acc_norm\": 0.9182630906768838,\n \"acc_norm_stderr\": 0.009796913952313168\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.019829299214925416,\n \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.019829299214925416\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7016759776536313,\n \"acc_stderr\": 0.01530184004512928,\n \"acc_norm\": 0.7016759776536313,\n \"acc_norm_stderr\": 0.01530184004512928\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.0211706230112135,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.0211706230112135\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8488745980707395,\n \"acc_stderr\": 0.020342749744428634,\n \"acc_norm\": 0.8488745980707395,\n \"acc_norm_stderr\": 0.020342749744428634\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.018303868806891787,\n \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.018303868806891787\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6524822695035462,\n \"acc_stderr\": 0.02840662780959095,\n \"acc_norm\": 0.6524822695035462,\n \"acc_norm_stderr\": 0.02840662780959095\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6166883963494133,\n \"acc_stderr\": 0.012417603662901188,\n \"acc_norm\": 0.6166883963494133,\n \"acc_norm_stderr\": 0.012417603662901188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.022368672562886747,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.022368672562886747\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273337,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273337\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650153,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650153\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824667,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824667\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48959608323133413,\n \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6579360053724295,\n \"mc2_stderr\": 0.014740925357615238\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028228\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7429871114480667,\n \"acc_stderr\": 
0.01203678175742868\n }\n}\n```", "repo_url": "https://huggingface.co/moreh/MoMo-70B-lora-1.8.5-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|arc:challenge|25_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|gsm8k|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hellaswag|10_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T20-00-36.558108.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T20-00-36.558108.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T20-00-36.558108.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T20-00-36.558108.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T20-00-36.558108.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T20_00_36.558108", "path": ["**/details_harness|winogrande|5_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T20-00-36.558108.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T20_00_36.558108", "path": ["results_2024-01-14T20-00-36.558108.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T20-00-36.558108.parquet"]}]}]} | 2024-01-14T20:03:05+00:00 |
abf56a6780eb8176b92522309fa7006416859519 | andersonbcdefg/MEDI-processed-no-instruct-dedup-taskfiltered | [
"region:us"
] | 2024-01-14T20:05:40+00:00 | {"dataset_info": {"features": [{"name": "pos", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "neg", "dtype": "string"}, {"name": "query", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 425167107.0593314, "num_examples": 337877}], "download_size": 321552494, "dataset_size": 425167107.0593314}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T20:09:59+00:00 |
|
c0bd86fd3eb056f93c91dbfc80ac0f94178ec4fd | modelloosrvcc/LuanGalinha | [
"license:openrail",
"region:us"
] | 2024-01-14T20:10:40+00:00 | {"license": "openrail"} | 2024-01-14T20:10:52+00:00 |
|
d61a05cd1ad7c1f14078dd4e7bcc93257747f4c4 | # Dataset Card for "DoctorKelp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | KeynesYouDigIt/DoctorKelp | [
"region:us"
] | 2024-01-14T20:18:05+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "test_satellite", "1": "train_kelp", "2": "train_satellite"}}}}], "splits": [{"name": "train", "num_bytes": 28827196275.44, "num_examples": 22540}, {"name": "test", "num_bytes": 3643649767.064, "num_examples": 2852}], "download_size": 18049706797, "dataset_size": 32470846042.503998}} | 2024-01-14T20:44:29+00:00 |
c147d179b4796e0f78993d80160b5195b6bcb035 |
# Dataset of mai/マイ/마이 (Touhou)
This is the dataset of mai/マイ/마이 (Touhou), containing 159 images and their tags.
The core tags of this character are `blue_hair, bow, blue_eyes, hair_bow, short_hair, wings, ribbon, angel_wings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 159 | 143.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 159 | 95.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 291 | 177.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 159 | 131.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 291 | 231.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mai_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
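If you only need one of the pre-processed IMG+TXT packages from the table above, it can be fetched the same way (a sketch; swap in the filename of the package you want, e.g. `dataset-800.zip` or `dataset-1200.zip`):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the pre-processed packages listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/mai_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',  # shorter side not exceeding 800 pixels, IMG+TXT pairs
)

# extract the image/tag-text pairs to a local directory
output_dir = 'dataset_800'
os.makedirs(output_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(output_dir)
```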
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, dress, smile, solo, purple_eyes |
| 1 | 6 |  |  |  |  |  | 1girl, dress, solo |
| 2 | 22 |  |  |  |  |  | 1girl, puffy_short_sleeves, white_wings, feathered_wings, solo, white_dress, white_bow, bangs, buttons, looking_at_viewer, closed_mouth, breasts, black_ribbon, smile, frilled_sleeves, black_sash, blush |
| 3 | 12 |  |  |  |  |  | 2girls, blonde_hair, dress, blush, hat, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | smile | solo | purple_eyes | puffy_short_sleeves | white_wings | feathered_wings | white_dress | white_bow | bangs | buttons | looking_at_viewer | closed_mouth | breasts | black_ribbon | frilled_sleeves | black_sash | blush | 2girls | blonde_hair | hat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-------|:--------------|:----------------------|:--------------|:------------------|:--------------|:------------|:--------|:----------|:--------------------|:---------------|:----------|:---------------|:------------------|:-------------|:--------|:---------|:--------------|:------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | | | | | | |
| 2 | 22 |  |  |  |  |  | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | |
| 3 | 12 |  |  |  |  |  | | X | X | | | | | | | | | | | | | | | | X | X | X | X |
| CyberHarem/mai_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T20:19:33+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T21:00:36+00:00 |
5fbc342dfa2e798679c21d55e925af46cc12dc26 | GilsonRDF/ExercisesLlama | [
"region:us"
] | 2024-01-14T20:20:56+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2306.4, "num_examples": 24}, {"name": "test", "num_bytes": 576.6, "num_examples": 6}], "download_size": 4045, "dataset_size": 2883.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T23:47:43+00:00 |
|
e7e763873bebec76c2230dd03eeede6dabf81dbb | jilp00/youtoks-transcripts-Kanji-Learning | [
"region:us"
] | 2024-01-14T20:24:03+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 68997, "num_examples": 78}], "download_size": 41971, "dataset_size": 68997}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T20:24:06+00:00 |
|
bc95b15e7fbcc7b74bae067aa6cc3638e94207f0 | # Dataset Card for "distilabel-intel-orca-dpo-pairs-binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | floleuerer/distilabel-intel-orca-dpo-pairs-binarized | [
"region:us"
] | 2024-01-14T20:28:42+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 24252252.089665655, "num_examples": 5625}, {"name": "test", "num_bytes": 1280518.9103343466, "num_examples": 297}], "download_size": 13698335, "dataset_size": 25532771.0}} | 2024-01-14T20:31:36+00:00 |
3ae30ea5fc42b1069ea99237d031a9b0480f08ee | AlexDom/labeling_with_pretrained | [
"region:us"
] | 2024-01-14T20:28:47+00:00 | {} | 2024-01-14T20:28:48+00:00 |
|
1cf34e2e4f7e56d0fb8279d7feb6ba958e063b79 |
This dataset contains nearly 18,000 European Member of Parliament (MEP) speeches between 2019 and 2023.
The speeches are from Italian, German, French and Belgian MEPs.
All the speeches were scraped from the European Parliament website using this code: https://github.com/misclassified/meps-text-mining | misclassified/meps_speeches | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T20:32:39+00:00 | {"license": "apache-2.0"} | 2024-01-14T20:45:53+00:00 |
881b64d8ff972c225ee6c1dcd8897f15b82db21e |
# Dataset Card for Evaluation run of Jaume/openchat-3.5-0106-mod-gpt5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Jaume/openchat-3.5-0106-mod-gpt5](https://huggingface.co/Jaume/openchat-3.5-0106-mod-gpt5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Jaume__openchat-3.5-0106-mod-gpt5",
"harness_winogrande_5",
split="train")
```
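
The aggregated "results" configuration mentioned above can be loaded with the same API; this is a minimal sketch, assuming the usual convention that the "latest" split points to the most recent run:

```python
from datasets import load_dataset

# load the aggregated results configuration of this details repo
results = load_dataset(
    "open-llm-leaderboard/details_Jaume__openchat-3.5-0106-mod-gpt5",
    "results",
    split="latest",
)
print(results[0])
```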
## Latest results
These are the [latest results from run 2024-01-14T21:01:35.974498](https://huggingface.co/datasets/open-llm-leaderboard/details_Jaume__openchat-3.5-0106-mod-gpt5/blob/main/results_2024-01-14T21-01-35.974498.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6528578653707416,
"acc_stderr": 0.031849870154313474,
"acc_norm": 0.6535559561419437,
"acc_norm_stderr": 0.03250454817189663,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5189602568049447,
"mc2_stderr": 0.015303685990455876
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000324,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6338378809002191,
"acc_stderr": 0.0048076995399734075,
"acc_norm": 0.8293168691495718,
"acc_norm_stderr": 0.0037546293132751625
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291943,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291943
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741626,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741626
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123563,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123563
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008557,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008557
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4869621903520209,
"acc_stderr": 0.012765893883835332,
"acc_norm": 0.4869621903520209,
"acc_norm_stderr": 0.012765893883835332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02679956202488766,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02679956202488766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5189602568049447,
"mc2_stderr": 0.015303685990455876
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267195
},
"harness|gsm8k|5": {
"acc": 0.6815769522365428,
"acc_stderr": 0.01283222572307541
}
}
```
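
If only the aggregated numbers shown above are needed, the results file linked in this section can also be fetched directly. This is a minimal sketch (the file name is taken from the link above; the exact nesting of the JSON may vary between harness versions, as noted in the comments):

```python
import json

from huggingface_hub import hf_hub_download

# download the latest aggregated results file referenced above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Jaume__openchat-3.5-0106-mod-gpt5",
    repo_type="dataset",
    filename="results_2024-01-14T21-01-35.974498.json",
)
with open(path) as f:
    results = json.load(f)

# the aggregated scores shown above live under the "all" key
# (possibly nested under "results" in the raw file)
all_scores = results.get("results", results).get("all", {})
print(all_scores)
```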
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Jaume__openchat-3.5-0106-mod-gpt5 | [
"region:us"
] | 2024-01-14T20:43:38+00:00 | {"pretty_name": "Evaluation run of Jaume/openchat-3.5-0106-mod-gpt5", "dataset_summary": "Dataset automatically created during the evaluation run of model [Jaume/openchat-3.5-0106-mod-gpt5](https://huggingface.co/Jaume/openchat-3.5-0106-mod-gpt5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Jaume__openchat-3.5-0106-mod-gpt5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T21:01:35.974498](https://huggingface.co/datasets/open-llm-leaderboard/details_Jaume__openchat-3.5-0106-mod-gpt5/blob/main/results_2024-01-14T21-01-35.974498.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6528578653707416,\n \"acc_stderr\": 0.031849870154313474,\n \"acc_norm\": 0.6535559561419437,\n \"acc_norm_stderr\": 0.03250454817189663,\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5189602568049447,\n \"mc2_stderr\": 0.015303685990455876\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000324,\n \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6338378809002191,\n \"acc_stderr\": 0.0048076995399734075,\n \"acc_norm\": 0.8293168691495718,\n \"acc_norm_stderr\": 0.0037546293132751625\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908353,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908353\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291943,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291943\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8365261813537676,\n \"acc_stderr\": 0.013223928616741626,\n \"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741626\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123563,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123563\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008557,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008557\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4869621903520209,\n \"acc_stderr\": 0.012765893883835332,\n \"acc_norm\": 0.4869621903520209,\n \"acc_norm_stderr\": 0.012765893883835332\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02679956202488766,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02679956202488766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5189602568049447,\n \"mc2_stderr\": 0.015303685990455876\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267195\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6815769522365428,\n \"acc_stderr\": 0.01283222572307541\n 
}\n}\n```", "repo_url": "https://huggingface.co/Jaume/openchat-3.5-0106-mod-gpt5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|arc:challenge|25_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|arc:challenge|25_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|gsm8k|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|gsm8k|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hellaswag|10_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hellaswag|10_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T20-41-18.617914.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T20-41-18.617914.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-01-35.974498.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-01-35.974498.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-01-35.974498.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T21-01-35.974498.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T20-41-18.617914.parquet"]}, 
{"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["**/details_harness|winogrande|5_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": ["**/details_harness|winogrande|5_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T21-01-35.974498.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T20_41_18.617914", "path": ["results_2024-01-14T20-41-18.617914.parquet"]}, {"split": "2024_01_14T21_01_35.974498", "path": 
["results_2024-01-14T21-01-35.974498.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T21-01-35.974498.parquet"]}]}]} | 2024-01-14T21:04:15+00:00 |
007dba5ec7a9af6d8700eca0a80b5962f5dd86f2 | Nomadsb212/images | [
"license:mit",
"region:us"
] | 2024-01-14T20:54:30+00:00 | {"license": "mit"} | 2024-01-14T20:57:38+00:00 |
|
2dbec0754c34d492e4922c3e0f77cc277936becc | mlabonne/chessllm | [
"region:us"
] | 2024-01-14T20:58:49+00:00 | {} | 2024-01-14T22:02:46+00:00 |
|
58072351479e03a160e1e44a29ba5602ac5ac280 | rs0x29a/the-stack-yaml-camel-k | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T21:02:17+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "hexsha", "dtype": "string"}, {"name": "size", "dtype": "int64"}, {"name": "ext", "dtype": "string"}, {"name": "lang", "dtype": "string"}, {"name": "max_stars_repo_path", "dtype": "string"}, {"name": "max_stars_repo_name", "dtype": "string"}, {"name": "max_stars_repo_head_hexsha", "dtype": "string"}, {"name": "max_stars_repo_licenses", "sequence": "string"}, {"name": "max_stars_count", "dtype": "int64"}, {"name": "max_stars_repo_stars_event_min_datetime", "dtype": "string"}, {"name": "max_stars_repo_stars_event_max_datetime", "dtype": "string"}, {"name": "max_issues_repo_path", "dtype": "string"}, {"name": "max_issues_repo_name", "dtype": "string"}, {"name": "max_issues_repo_head_hexsha", "dtype": "string"}, {"name": "max_issues_repo_licenses", "sequence": "string"}, {"name": "max_issues_count", "dtype": "int64"}, {"name": "max_issues_repo_issues_event_min_datetime", "dtype": "string"}, {"name": "max_issues_repo_issues_event_max_datetime", "dtype": "string"}, {"name": "max_forks_repo_path", "dtype": "string"}, {"name": "max_forks_repo_name", "dtype": "string"}, {"name": "max_forks_repo_head_hexsha", "dtype": "string"}, {"name": "max_forks_repo_licenses", "sequence": "string"}, {"name": "max_forks_count", "dtype": "int64"}, {"name": "max_forks_repo_forks_event_min_datetime", "dtype": "string"}, {"name": "max_forks_repo_forks_event_max_datetime", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "avg_line_length", "dtype": "float64"}, {"name": "max_line_length", "dtype": "int64"}, {"name": "alphanum_fraction", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 297506.9341430791, "num_examples": 40}], "download_size": 66785, "dataset_size": 297506.9341430791}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T10:10:00+00:00 |
|
73581a917845ae353eca2e55ac445997242f286b | LeeHarrold/yocto-manual-completion | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T21:04:38+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 531238, "num_examples": 123}], "download_size": 263830, "dataset_size": 531238}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T21:08:44+00:00 |
|
e2bd101a21e9d47b01d171167b0e0b72b9207f36 | senhorsapo/charlie | [
"license:openrail",
"region:us"
] | 2024-01-14T21:07:19+00:00 | {"license": "openrail"} | 2024-01-14T21:07:19+00:00 |
|
21d286fdf12418c06fea8a9b31f513db3dac3215 | jilp00/youtoks-transcripts-Intro-Psychology | [
"region:us"
] | 2024-01-14T21:08:04+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1360676, "num_examples": 1583}], "download_size": 757845, "dataset_size": 1360676}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T21:08:05+00:00 |
|
2b8468580922d27395257e018a02bf16485e8cac | IsaacLabe/4D-Gaussian-Semantic-data | [
"region:us"
] | 2024-01-14T21:08:33+00:00 | {} | 2024-01-19T09:23:25+00:00 |
|
4f5a49be0bc9ea4dbfc48caf316c9d93a7d83eb4 | adambuttrick/100K-ner-indexes-multiple-organizations-locations-alpaca-format-json-response-all-cases | [
"license:cc0-1.0",
"region:us"
] | 2024-01-14T21:10:32+00:00 | {"license": "cc0-1.0"} | 2024-01-14T21:12:39+00:00 |
|
8515cfcc43a1e5242d97d0b006fa9d0a5f61ddd2 | # Dataset Card for "autotrain-data-autotrain-jose-antorcha-22"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | pedromigurasdev/autotrain-data-autotrain-jose-antorcha-22 | [
"region:us"
] | 2024-01-14T21:14:16+00:00 | {"dataset_info": {"features": [{"name": "autotrain_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 555000, "num_examples": 840}, {"name": "validation", "num_bytes": 555000, "num_examples": 840}], "download_size": 84992, "dataset_size": 1110000}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T21:14:25+00:00 |
556e82854a945a113b665d0640868a71ee1240b8 | JackLilley/CMMC | [
"license:mit",
"region:us"
] | 2024-01-14T21:14:44+00:00 | {"license": "mit"} | 2024-01-14T21:18:06+00:00 |
|
051d37a3a27c981bb3a18ae37fe0ebcc20e8c48d |
# Dataset Card for Evaluation run of NovoCode/Novocode7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NovoCode/Novocode7b](https://huggingface.co/NovoCode/Novocode7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NovoCode__Novocode7b",
"harness_winogrande_5",
split="train")
```
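The aggregated metrics live in the "results" configuration. A minimal sketch for reading them (assuming the "results" configuration exposes the same "latest" split layout as the task configurations listed in the metadata):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; earlier runs remain available
# under their timestamped splits (e.g. "2024_01_14T21_20_28.943538").
results = load_dataset("open-llm-leaderboard/details_NovoCode__Novocode7b",
	"results",
	split="latest")
print(results[0])  # the aggregated accuracies for the latest run
```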
## Latest results
These are the [latest results from run 2024-01-23T01:09:59.087164](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Novocode7b/blob/main/results_2024-01-23T01-09-59.087164.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5637380070206868,
"acc_stderr": 0.03397699301826096,
"acc_norm": 0.5694898071045811,
"acc_norm_stderr": 0.03471749621521052,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6276801807189292,
"mc2_stderr": 0.015415755094430335
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.01454451988063383,
"acc_norm": 0.5878839590443686,
"acc_norm_stderr": 0.014383915302225403
},
"harness|hellaswag|10": {
"acc": 0.6214897430790679,
"acc_stderr": 0.004840244782805302,
"acc_norm": 0.8051185022903804,
"acc_norm_stderr": 0.003952999181084448
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.02964781353936525,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.02964781353936525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308753,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308753
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438803,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438803
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501628,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501628
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.03166009679399814,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.03166009679399814
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.030381931949990403,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.030381931949990403
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.0458790474130181,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.0458790474130181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395965,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395965
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.026261677607806642,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.026261677607806642
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678513,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678513
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631462,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631462
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.02736807824397165,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.02736807824397165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.027339546640662737,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.027339546640662737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3859191655801825,
"acc_stderr": 0.012433398911476143,
"acc_norm": 0.3859191655801825,
"acc_norm_stderr": 0.012433398911476143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5245098039215687,
"acc_stderr": 0.02020351728026144,
"acc_norm": 0.5245098039215687,
"acc_norm_stderr": 0.02020351728026144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623326,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6276801807189292,
"mc2_stderr": 0.015415755094430335
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773218
},
"harness|gsm8k|5": {
"acc": 0.2304776345716452,
"acc_stderr": 0.011600249020595822
}
}
```
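To browse the per-sample details behind these numbers rather than the aggregates, one possible sketch is the following (the available columns are not specified by this card, so they are printed rather than assumed; `harness_gsm8k_5` is one of the configurations listed in the metadata):

```python
from datasets import load_dataset

# Per-sample details for one task of the latest run; the exact fields
# (prompts, predictions, per-example metrics) vary with the harness version.
details = load_dataset("open-llm-leaderboard/details_NovoCode__Novocode7b",
	"harness_gsm8k_5",
	split="latest")
df = details.to_pandas()
print(df.columns.tolist())  # inspect which fields are available before filtering
```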
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NovoCode__Novocode7b | [
"region:us"
] | 2024-01-14T21:22:48+00:00 | {"pretty_name": "Evaluation run of NovoCode/Novocode7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [NovoCode/Novocode7b](https://huggingface.co/NovoCode/Novocode7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NovoCode__Novocode7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T01:09:59.087164](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Novocode7b/blob/main/results_2024-01-23T01-09-59.087164.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5637380070206868,\n \"acc_stderr\": 0.03397699301826096,\n \"acc_norm\": 0.5694898071045811,\n \"acc_norm_stderr\": 0.03471749621521052,\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6276801807189292,\n \"mc2_stderr\": 0.015415755094430335\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.01454451988063383,\n \"acc_norm\": 0.5878839590443686,\n \"acc_norm_stderr\": 0.014383915302225403\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6214897430790679,\n \"acc_stderr\": 0.004840244782805302,\n \"acc_norm\": 0.8051185022903804,\n \"acc_norm_stderr\": 0.003952999181084448\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.02964781353936525,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.02964781353936525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n 
\"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n \"acc_stderr\": 0.027218889773308753,\n \"acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.027218889773308753\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.541025641025641,\n 
\"acc_stderr\": 0.025265525491284295,\n \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399814,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399814\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990403,\n \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990403\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.0458790474130181,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.0458790474130181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n \"acc_stderr\": 0.015464676163395965,\n \"acc_norm\": 0.7509578544061303,\n \"acc_norm_stderr\": 
0.015464676163395965\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n \"acc_stderr\": 0.016232826818678513,\n \"acc_norm\": 0.37988826815642457,\n \"acc_norm_stderr\": 0.016232826818678513\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631462,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631462\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n \"acc_stderr\": 0.02736807824397165,\n \"acc_norm\": 0.6334405144694534,\n \"acc_norm_stderr\": 0.02736807824397165\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662737,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3859191655801825,\n \"acc_stderr\": 0.012433398911476143,\n \"acc_norm\": 0.3859191655801825,\n \"acc_norm_stderr\": 0.012433398911476143\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5245098039215687,\n \"acc_stderr\": 0.02020351728026144,\n \"acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.02020351728026144\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623326,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6276801807189292,\n \"mc2_stderr\": 0.015415755094430335\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773218\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2304776345716452,\n \"acc_stderr\": 0.011600249020595822\n }\n}\n```", "repo_url": "https://huggingface.co/NovoCode/Novocode7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|arc:challenge|25_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|arc:challenge|25_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|arc:challenge|25_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|gsm8k|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|gsm8k|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|gsm8k|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hellaswag|10_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hellaswag|10_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hellaswag|10_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-20-28.943538.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T21-20-28.943538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T00-46-49.917108.parquet", 
"**/details_harness|hendrycksTest-management|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T00-46-49.917108.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T01-09-59.087164.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T01-09-59.087164.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T01-09-59.087164.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T01-09-59.087164.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", 
"path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["**/details_harness|winogrande|5_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["**/details_harness|winogrande|5_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["**/details_harness|winogrande|5_2024-01-23T01-09-59.087164.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T01-09-59.087164.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T21_20_28.943538", "path": ["results_2024-01-14T21-20-28.943538.parquet"]}, {"split": "2024_01_23T00_46_49.917108", "path": ["results_2024-01-23T00-46-49.917108.parquet"]}, {"split": "2024_01_23T01_09_59.087164", "path": ["results_2024-01-23T01-09-59.087164.parquet"]}, 
{"split": "latest", "path": ["results_2024-01-23T01-09-59.087164.parquet"]}]}]} | 2024-01-23T01:12:22+00:00 |
9e78ee1fa8cd4681c6d4259e3b66ee3ec1cf4c2e |
# Dataset Card for Evaluation run of kz919/mistral-7b-sft-open-orca-flan-50k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kz919/mistral-7b-sft-open-orca-flan-50k](https://huggingface.co/kz919/mistral-7b-sft-open-orca-flan-50k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kz919__mistral-7b-sft-open-orca-flan-50k",
"harness_winogrande_5",
split="train")
```
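If you only need the aggregated numbers rather than per-sample details, a minimal sketch is shown below; the `"results"` configuration and `"latest"` split follow the names declared in this card's metadata and should be treated as assumptions rather than a guaranteed API.
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated results of the most recent evaluation run.
# "results" and "latest" mirror the configuration/split names listed in this card's
# metadata; they are assumptions, not a documented contract of this repo.
aggregated = load_dataset(
    "open-llm-leaderboard/details_kz919__mistral-7b-sft-open-orca-flan-50k",
    "results",
    split="latest",
)
print(aggregated[0])  # one row holding the aggregated metrics for the latest run
```
The loaded split should contain the same aggregated metrics reported in the "Latest results" section below.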
## Latest results
These are the [latest results from run 2024-01-14T21:25:51.230819](https://huggingface.co/datasets/open-llm-leaderboard/details_kz919__mistral-7b-sft-open-orca-flan-50k/blob/main/results_2024-01-14T21-25-51.230819.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5538213786755696,
"acc_stderr": 0.03369594673096056,
"acc_norm": 0.5621293960309836,
"acc_norm_stderr": 0.03447812044023231,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082683,
"mc2": 0.3749461951546611,
"mc2_stderr": 0.014143079789920542
},
"harness|arc:challenge|25": {
"acc": 0.5255972696245734,
"acc_stderr": 0.014592230885298964,
"acc_norm": 0.5878839590443686,
"acc_norm_stderr": 0.014383915302225403
},
"harness|hellaswag|10": {
"acc": 0.6160127464648476,
"acc_stderr": 0.004853608805843885,
"acc_norm": 0.8191595299741088,
"acc_norm_stderr": 0.0038409935166272657
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.03765746693865149,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.03765746693865149
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330876,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330876
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.03119584087770029,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.03119584087770029
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940794,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940794
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.02488211685765508,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.02488211685765508
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.01902848671111544,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.01902848671111544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.032834720561085606,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.032834720561085606
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.030874537537553617,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.030874537537553617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.015190473717037497,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.015190473717037497
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531018,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531018
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.01473692638376196,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.01473692638376196
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387292,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387292
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004903,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004903
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573096,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573096
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3878748370273794,
"acc_stderr": 0.012444998309675609,
"acc_norm": 0.3878748370273794,
"acc_norm_stderr": 0.012444998309675609
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5898692810457516,
"acc_stderr": 0.019898412717635913,
"acc_norm": 0.5898692810457516,
"acc_norm_stderr": 0.019898412717635913
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.6,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.032744852119469564,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.032744852119469564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082683,
"mc2": 0.3749461951546611,
"mc2_stderr": 0.014143079789920542
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089684
},
"harness|gsm8k|5": {
"acc": 0.10310841546626232,
"acc_stderr": 0.008376436987507795
}
}
```
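To turn the per-task numbers above into a quick ranking, a minimal sketch is given below; it assumes the dictionary shown above has been saved locally under the hypothetical filename `results.json`.
```python
import json

# Minimal sketch: parse the results dictionary shown above (assumed saved locally
# as the hypothetical "results.json") and print per-task accuracy, best to worst.
with open("results.json") as f:
    results = json.load(f)

rows = []
for task, metrics in results.items():
    if task == "all":
        continue  # skip the overall aggregate entry
    acc = metrics.get("acc_norm", metrics.get("acc"))
    if acc is not None:
        rows.append((task, acc))

for task, acc in sorted(rows, key=lambda r: r[1], reverse=True):
    print(f"{task:70s} {acc:.4f}")
```
Tasks that report no accuracy field (for example `harness|truthfulqa:mc|0`, which reports `mc1`/`mc2` instead) are simply skipped in this sketch.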
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kz919__mistral-7b-sft-open-orca-flan-50k | [
"region:us"
] | 2024-01-14T21:28:10+00:00 | {"pretty_name": "Evaluation run of kz919/mistral-7b-sft-open-orca-flan-50k", "dataset_summary": "Dataset automatically created during the evaluation run of model [kz919/mistral-7b-sft-open-orca-flan-50k](https://huggingface.co/kz919/mistral-7b-sft-open-orca-flan-50k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kz919__mistral-7b-sft-open-orca-flan-50k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T21:25:51.230819](https://huggingface.co/datasets/open-llm-leaderboard/details_kz919__mistral-7b-sft-open-orca-flan-50k/blob/main/results_2024-01-14T21-25-51.230819.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5538213786755696,\n \"acc_stderr\": 0.03369594673096056,\n \"acc_norm\": 0.5621293960309836,\n \"acc_norm_stderr\": 0.03447812044023231,\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.01522589934082683,\n \"mc2\": 0.3749461951546611,\n \"mc2_stderr\": 0.014143079789920542\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5255972696245734,\n \"acc_stderr\": 0.014592230885298964,\n \"acc_norm\": 0.5878839590443686,\n \"acc_norm_stderr\": 0.014383915302225403\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6160127464648476,\n \"acc_stderr\": 0.004853608805843885,\n \"acc_norm\": 0.8191595299741088,\n \"acc_norm_stderr\": 0.0038409935166272657\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955784,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955784\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.03765746693865149,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.03765746693865149\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330876,\n \"acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330876\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.03119584087770029,\n \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.03119584087770029\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940794,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940794\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.02488211685765508,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.02488211685765508\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7302752293577982,\n \"acc_stderr\": 0.01902848671111544,\n \"acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.01902848671111544\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6582278481012658,\n \"acc_stderr\": 0.030874537537553617,\n \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.030874537537553617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n \"acc_stderr\": 0.015190473717037497,\n 
\"acc_norm\": 0.7637292464878672,\n \"acc_norm_stderr\": 0.015190473717037497\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.01473692638376196,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.01473692638376196\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387292,\n \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387292\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n \"acc_stderr\": 0.026730620728004903,\n \"acc_norm\": 0.6688102893890675,\n \"acc_norm_stderr\": 0.026730620728004903\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573096,\n \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573096\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3878748370273794,\n \"acc_stderr\": 0.012444998309675609,\n \"acc_norm\": 0.3878748370273794,\n \"acc_norm_stderr\": 0.012444998309675609\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5898692810457516,\n \"acc_stderr\": 0.019898412717635913,\n \"acc_norm\": 0.5898692810457516,\n \"acc_norm_stderr\": 0.019898412717635913\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.01522589934082683,\n \"mc2\": 0.3749461951546611,\n \"mc2_stderr\": 0.014143079789920542\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089684\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10310841546626232,\n \"acc_stderr\": 0.008376436987507795\n }\n}\n```", "repo_url": "https://huggingface.co/kz919/mistral-7b-sft-open-orca-flan-50k", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|arc:challenge|25_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|gsm8k|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hellaswag|10_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-25-51.230819.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-25-51.230819.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-25-51.230819.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T21-25-51.230819.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-25-51.230819.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-25-51.230819.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["**/details_harness|winogrande|5_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T21-25-51.230819.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T21_25_51.230819", "path": ["results_2024-01-14T21-25-51.230819.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T21-25-51.230819.parquet"]}]}]} | 2024-01-14T21:28:31+00:00 |
98632809254d6f073203049e47e194057d0d0776 |
# Dataset Card for Evaluation run of bhavinjawade/SOLAR-10B-Nector-DPO-Jawade
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bhavinjawade/SOLAR-10B-Nector-DPO-Jawade](https://huggingface.co/bhavinjawade/SOLAR-10B-Nector-DPO-Jawade) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-Nector-DPO-Jawade",
"harness_winogrande_5",
split="train")
```
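The same pattern works for any other configuration listed in this card's metadata. For instance, a minimal sketch (assuming the "results" configuration and the "latest" split, both of which appear in the configuration list below) to pull the aggregated numbers directly:
```python
from datasets import load_dataset

# "latest" always points to the most recent run; use the timestamped split
# name (e.g. "2024_01_14T21_40_44.530689") to pin a specific run instead.
results = load_dataset(
    "open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-Nector-DPO-Jawade",
    "results",
    split="latest",
)
```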
## Latest results
These are the [latest results from run 2024-01-14T21:40:44.530689](https://huggingface.co/datasets/open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-Nector-DPO-Jawade/blob/main/results_2024-01-14T21-40-44.530689.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6659513885128865,
"acc_stderr": 0.03153636640803569,
"acc_norm": 0.6668604037396749,
"acc_norm_stderr": 0.03217609086906697,
"mc1": 0.5618115055079559,
"mc1_stderr": 0.01736923616440442,
"mc2": 0.7092186670643685,
"mc2_stderr": 0.01520446597729704
},
"harness|arc:challenge|25": {
"acc": 0.6851535836177475,
"acc_stderr": 0.01357265770308495,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274779
},
"harness|hellaswag|10": {
"acc": 0.7124078868751245,
"acc_stderr": 0.0045171484341804905,
"acc_norm": 0.8861780521808404,
"acc_norm_stderr": 0.0031694581233577238
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388535,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388535
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3814814814814815,
"acc_stderr": 0.029616718927497593,
"acc_norm": 0.3814814814814815,
"acc_norm_stderr": 0.029616718927497593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568617,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597524,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597524
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217575,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217575
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36201117318435755,
"acc_stderr": 0.016073067350153087,
"acc_norm": 0.36201117318435755,
"acc_norm_stderr": 0.016073067350153087
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046095,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046095
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.524822695035461,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.524822695035461,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4915254237288136,
"acc_stderr": 0.01276840169726906,
"acc_norm": 0.4915254237288136,
"acc_norm_stderr": 0.01276840169726906
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.018635594034423983,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.018635594034423983
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5618115055079559,
"mc1_stderr": 0.01736923616440442,
"mc2": 0.7092186670643685,
"mc2_stderr": 0.01520446597729704
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370632
},
"harness|gsm8k|5": {
"acc": 0.6459438968915845,
"acc_stderr": 0.013172728385222567
}
}
```
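As a quick illustration of the structure above, the per-task entries can be aggregated with a few lines of Python. A minimal sketch, assuming a local copy of the linked results file (the on-disk layout may nest these entries under an extra key, so the lookup is kept tolerant):
```python
import json

# Hypothetical local copy of the results file linked above.
with open("results_2024-01-14T21-40-44.530689.json") as f:
    data = json.load(f)

# Tolerate either a flat layout (as shown above) or one nested under "results".
results = data.get("results", data)

# Collect the accuracy of every MMLU (hendrycksTest) subtask.
mmlu_scores = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest")
}
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc = "
      f"{sum(mmlu_scores.values()) / len(mmlu_scores):.4f}")
```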
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-Nector-DPO-Jawade | [
"region:us"
] | 2024-01-14T21:43:01+00:00 | {"pretty_name": "Evaluation run of bhavinjawade/SOLAR-10B-Nector-DPO-Jawade", "dataset_summary": "Dataset automatically created during the evaluation run of model [bhavinjawade/SOLAR-10B-Nector-DPO-Jawade](https://huggingface.co/bhavinjawade/SOLAR-10B-Nector-DPO-Jawade) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-Nector-DPO-Jawade\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T21:40:44.530689](https://huggingface.co/datasets/open-llm-leaderboard/details_bhavinjawade__SOLAR-10B-Nector-DPO-Jawade/blob/main/results_2024-01-14T21-40-44.530689.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6659513885128865,\n \"acc_stderr\": 0.03153636640803569,\n \"acc_norm\": 0.6668604037396749,\n \"acc_norm_stderr\": 0.03217609086906697,\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.01736923616440442,\n \"mc2\": 0.7092186670643685,\n \"mc2_stderr\": 0.01520446597729704\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.01357265770308495,\n \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274779\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7124078868751245,\n \"acc_stderr\": 0.0045171484341804905,\n \"acc_norm\": 0.8861780521808404,\n \"acc_norm_stderr\": 0.0031694581233577238\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388535,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388535\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497593,\n \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342853,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342853\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568617,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568617\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8020434227330779,\n \"acc_stderr\": 0.014248873549217575,\n \"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.014248873549217575\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36201117318435755,\n \"acc_stderr\": 0.016073067350153087,\n \"acc_norm\": 0.36201117318435755,\n \"acc_norm_stderr\": 0.016073067350153087\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046095,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046095\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.524822695035461,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.524822695035461,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4915254237288136,\n \"acc_stderr\": 0.01276840169726906,\n \"acc_norm\": 0.4915254237288136,\n \"acc_norm_stderr\": 0.01276840169726906\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.018635594034423983,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.018635594034423983\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.01736923616440442,\n \"mc2\": 0.7092186670643685,\n \"mc2_stderr\": 0.01520446597729704\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370632\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6459438968915845,\n \"acc_stderr\": 0.013172728385222567\n }\n}\n```", "repo_url": 
"https://huggingface.co/bhavinjawade/SOLAR-10B-Nector-DPO-Jawade", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|arc:challenge|25_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|gsm8k|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hellaswag|10_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-40-44.530689.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-40-44.530689.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-40-44.530689.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T21-40-44.530689.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-40-44.530689.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T21_40_44.530689", "path": ["**/details_harness|winogrande|5_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T21-40-44.530689.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T21_40_44.530689", "path": ["results_2024-01-14T21-40-44.530689.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T21-40-44.530689.parquet"]}]}]} | 2024-01-14T21:43:22+00:00 |
3624a06a494e69695437a43f7d3313a66976068e | anhnv125/code-small | [
"region:us"
] | 2024-01-14T21:44:30+00:00 | {"dataset_info": {"features": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4768858, "num_examples": 2217}], "download_size": 2223998, "dataset_size": 4768858}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T09:35:12+00:00 |
|
503a0329f0545ca92e3b629eacad587d01ddcd91 | Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links to the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets, but for a full explanation it is better to go to Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
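As a quick illustration (not part of the original instructions; the repo id is simply this dataset's id), the files of this dataset repository can be fetched locally with `huggingface_hub`:

```python
# Sketch: download this dataset repository from the Hugging Face Hub.
# snapshot_download returns the local folder containing the repo files.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="Marchanjo/spider-en-extra-3enr-1enb",
                              repo_type="dataset")
print(local_dir)
```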
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo José, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (as techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy results from 0.718 to 0.736 in a validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
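Purely as an illustration of the schema-pruning idea described above (this is not the authors' implementation; the heuristic and the toy schema below are invented for the sketch), pruning can be pictured as a lexical filter over the schema:

```python
# Toy sketch of schema pruning: drop tables/columns whose names share no token
# with the question. The real mRAT-SQL pruning is more involved; this only
# illustrates the idea of shrinking the schema before encoding it.
def prune_schema(question: str, schema: dict) -> dict:
    q_tokens = set(question.lower().replace("?", "").split())
    pruned = {}
    for table, columns in schema.items():
        table_hit = bool(set(table.lower().split("_")) & q_tokens)
        kept = [c for c in columns if set(c.lower().split("_")) & q_tokens]
        if table_hit or kept:
            pruned[table] = kept or columns
    return pruned

schema = {"singer": ["singer_id", "name", "age"], "concert": ["concert_id", "year"]}
print(prune_schema("show the name of each singer", schema))
# -> {'singer': ['singer_id', 'name']}
```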
[Paper published in Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35); [here is the pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-en-extra-3enr-1enb | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T21:46:17+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:37:42+00:00 |
526ae8624fe28cdd57032ca6bd7ca37e42e461df | Tsuinzues/gosma | [
"license:openrail",
"region:us"
] | 2024-01-14T21:47:28+00:00 | {"license": "openrail"} | 2024-01-14T21:47:41+00:00 |
|
25d605319f0c76580ae913d76d83b6e8b39c40fd |
This repo contains about 100 rows of random speech-to-speech VoxPopuli data. It can be used for quick testing of code and pipelines. | babs/vox-populi-subset | [
"region:us"
] | 2024-01-14T21:48:39+00:00 | {"dataset_info": {"features": [{"name": "source_id", "dtype": "string"}, {"name": "target_id", "dtype": "string"}, {"name": "source_audio", "dtype": "audio"}, {"name": "target_audio", "dtype": "audio"}, {"name": "target_units", "sequence": "int32"}], "splits": [{"name": "train", "num_bytes": 459597811.0, "num_examples": 1000}], "download_size": 457570458, "dataset_size": 459597811.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T19:44:53+00:00 |
4c30bb49eecfebc1110384125a3cc23268a417ea | ilostmygreggs/ai-voices | [
"region:us"
] | 2024-01-14T21:49:21+00:00 | {} | 2024-01-14T21:49:21+00:00 |
|
40ff2e5f9c768d72de7fee0bb924d0a3b52ec124 |
# Dataset Card for Evaluation run of h2m/mhm-7b-v1.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [h2m/mhm-7b-v1.3](https://huggingface.co/h2m/mhm-7b-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2m__mhm-7b-v1.3",
"harness_winogrande_5",
split="train")
```
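The per-example details use one configuration per task; the aggregated scores live in the "results" configuration. A minimal sketch (assuming the standard layout of these leaderboard detail repositories; the "latest" split points to the most recent run):

```python
from datasets import load_dataset

# Sketch: load the aggregated results of the latest run instead of per-example details.
results = load_dataset("open-llm-leaderboard/details_h2m__mhm-7b-v1.3",
                       "results",
                       split="latest")
print(results[0])
```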
## Latest results
These are the [latest results from run 2024-01-14T21:47:14.933980](https://huggingface.co/datasets/open-llm-leaderboard/details_h2m__mhm-7b-v1.3/blob/main/results_2024-01-14T21-47-14.933980.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45565733826199045,
"acc_stderr": 0.034441057472680836,
"acc_norm": 0.46104688055946413,
"acc_norm_stderr": 0.03520094341367283,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502353,
"mc2": 0.4622053324775365,
"mc2_stderr": 0.015177238897436999
},
"harness|arc:challenge|25": {
"acc": 0.44197952218430037,
"acc_stderr": 0.014512682523128345,
"acc_norm": 0.47525597269624575,
"acc_norm_stderr": 0.014593487694937738
},
"harness|hellaswag|10": {
"acc": 0.4901414060944035,
"acc_stderr": 0.004988811384747417,
"acc_norm": 0.6530571599283012,
"acc_norm_stderr": 0.004750245757533308
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458006,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458006
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.35319148936170214,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.35319148936170214,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848877,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848877
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4935483870967742,
"acc_stderr": 0.028441638233540505,
"acc_norm": 0.4935483870967742,
"acc_norm_stderr": 0.028441638233540505
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6062176165803109,
"acc_stderr": 0.035260770955482405,
"acc_norm": 0.6062176165803109,
"acc_norm_stderr": 0.035260770955482405
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4205128205128205,
"acc_stderr": 0.02502861027671086,
"acc_norm": 0.4205128205128205,
"acc_norm_stderr": 0.02502861027671086
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5944954128440367,
"acc_stderr": 0.021050997991896834,
"acc_norm": 0.5944954128440367,
"acc_norm_stderr": 0.021050997991896834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239171,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239171
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6371308016877637,
"acc_stderr": 0.03129920825530213,
"acc_norm": 0.6371308016877637,
"acc_norm_stderr": 0.03129920825530213
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.033516951676526276,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.033516951676526276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4723926380368098,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.4723926380368098,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041696,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.030351527323344934,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.030351527323344934
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6002554278416348,
"acc_stderr": 0.01751684790705328,
"acc_norm": 0.6002554278416348,
"acc_norm_stderr": 0.01751684790705328
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.026915047355369804,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.026915047355369804
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261433,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261433
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.02856869975222588,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.02856869975222588
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.48231511254019294,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.48231511254019294,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34028683181225555,
"acc_stderr": 0.012101217610223784,
"acc_norm": 0.34028683181225555,
"acc_norm_stderr": 0.012101217610223784
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.01988622103750187,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.01988622103750187
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333334,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5730994152046783,
"acc_stderr": 0.03793620616529917,
"acc_norm": 0.5730994152046783,
"acc_norm_stderr": 0.03793620616529917
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502353,
"mc2": 0.4622053324775365,
"mc2_stderr": 0.015177238897436999
},
"harness|winogrande|5": {
"acc": 0.6227308602999211,
"acc_stderr": 0.0136225679287995
},
"harness|gsm8k|5": {
"acc": 0.16679302501895377,
"acc_stderr": 0.010268516042629513
}
}
```
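For quick local checks, the per-task numbers above can be re-aggregated from the linked results file; a minimal sketch (not the harness's exact aggregation, and the file layout may differ slightly from the dictionary printed above):

```python
import json
from huggingface_hub import hf_hub_download

# Sketch: fetch the results JSON linked above and average the MMLU (hendrycksTest)
# per-task accuracies as a rough consistency check against the summary values.
path = hf_hub_download(repo_id="open-llm-leaderboard/details_h2m__mhm-7b-v1.3",
                       filename="results_2024-01-14T21-47-14.933980.json",
                       repo_type="dataset")
data = json.load(open(path))
results = data.get("results", data)  # the per-task dict may sit under a "results" key
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
print(sum(v["acc"] for v in mmlu.values()) / len(mmlu))
```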
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_h2m__mhm-7b-v1.3 | [
"region:us"
] | 2024-01-14T21:49:34+00:00 | {"pretty_name": "Evaluation run of h2m/mhm-7b-v1.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [h2m/mhm-7b-v1.3](https://huggingface.co/h2m/mhm-7b-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2m__mhm-7b-v1.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T21:47:14.933980](https://huggingface.co/datasets/open-llm-leaderboard/details_h2m__mhm-7b-v1.3/blob/main/results_2024-01-14T21-47-14.933980.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45565733826199045,\n \"acc_stderr\": 0.034441057472680836,\n \"acc_norm\": 0.46104688055946413,\n \"acc_norm_stderr\": 0.03520094341367283,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502353,\n \"mc2\": 0.4622053324775365,\n \"mc2_stderr\": 0.015177238897436999\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.44197952218430037,\n \"acc_stderr\": 0.014512682523128345,\n \"acc_norm\": 0.47525597269624575,\n \"acc_norm_stderr\": 0.014593487694937738\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4901414060944035,\n \"acc_stderr\": 0.004988811384747417,\n \"acc_norm\": 0.6530571599283012,\n \"acc_norm_stderr\": 0.004750245757533308\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458006,\n \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458006\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n 
\"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686781,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686781\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.031245325202761926,\n \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.031245325202761926\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432563,\n \"acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432563\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04006168083848877,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04006168083848877\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4935483870967742,\n \"acc_stderr\": 0.028441638233540505,\n \"acc_norm\": 0.4935483870967742,\n \"acc_norm_stderr\": 0.028441638233540505\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.038154943086889305,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.038154943086889305\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6062176165803109,\n \"acc_stderr\": 0.035260770955482405,\n \"acc_norm\": 0.6062176165803109,\n \"acc_norm_stderr\": 0.035260770955482405\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4205128205128205,\n \"acc_stderr\": 0.02502861027671086,\n \"acc_norm\": 0.4205128205128205,\n \"acc_norm_stderr\": 0.02502861027671086\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.032104790510157764,\n \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.032104790510157764\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5944954128440367,\n \"acc_stderr\": 0.021050997991896834,\n \"acc_norm\": 0.5944954128440367,\n \"acc_norm_stderr\": 0.021050997991896834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239171,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239171\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6371308016877637,\n \"acc_stderr\": 0.03129920825530213,\n \"acc_norm\": 0.6371308016877637,\n \"acc_norm_stderr\": 0.03129920825530213\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n \"acc_stderr\": 0.033516951676526276,\n \"acc_norm\": 0.5246636771300448,\n \"acc_norm_stderr\": 0.033516951676526276\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4723926380368098,\n \"acc_stderr\": 0.0392237829061099,\n \"acc_norm\": 0.4723926380368098,\n \"acc_norm_stderr\": 0.0392237829061099\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041696,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041696\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n \"acc_stderr\": 0.030351527323344934,\n \"acc_norm\": 0.688034188034188,\n \"acc_norm_stderr\": 0.030351527323344934\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6002554278416348,\n \"acc_stderr\": 0.01751684790705328,\n 
\"acc_norm\": 0.6002554278416348,\n \"acc_norm_stderr\": 0.01751684790705328\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.026915047355369804,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.026915047355369804\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.014444157808261433,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.014444157808261433\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4673202614379085,\n \"acc_stderr\": 0.02856869975222588,\n \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.02856869975222588\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.48231511254019294,\n \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.48231511254019294,\n \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759422,\n \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759422\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34028683181225555,\n \"acc_stderr\": 0.012101217610223784,\n \"acc_norm\": 0.34028683181225555,\n \"acc_norm_stderr\": 0.012101217610223784\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016636,\n \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016636\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4084967320261438,\n \"acc_stderr\": 0.01988622103750187,\n \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.01988622103750187\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004128,\n \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004128\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5730994152046783,\n \"acc_stderr\": 0.03793620616529917,\n \"acc_norm\": 0.5730994152046783,\n \"acc_norm_stderr\": 0.03793620616529917\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502353,\n \"mc2\": 0.4622053324775365,\n \"mc2_stderr\": 0.015177238897436999\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6227308602999211,\n \"acc_stderr\": 0.0136225679287995\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16679302501895377,\n \"acc_stderr\": 0.010268516042629513\n }\n}\n```", "repo_url": 
"https://huggingface.co/h2m/mhm-7b-v1.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|arc:challenge|25_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|gsm8k|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hellaswag|10_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-47-14.933980.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-47-14.933980.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-47-14.933980.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T21-47-14.933980.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-47-14.933980.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-47-14.933980.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["**/details_harness|winogrande|5_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T21-47-14.933980.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T21_47_14.933980", "path": ["results_2024-01-14T21-47-14.933980.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T21-47-14.933980.parquet"]}]}]} | 2024-01-14T21:49:54+00:00 |
d9cb7bf3db3f837856d2730cb1bd24fc4adda679 | Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for the full explanation, it is better to go to the Github repository [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
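For instance, the files of this dataset repository can be mirrored locally with `huggingface_hub` (a minimal sketch, assuming only that the data is stored as regular repository files, as in the original Spider release):
```python
# Minimal sketch: download the spider-pt dataset files from the Hugging Face Hub.
# Assumes the repository stores plain data files (e.g. JSON/SQLite), as in the Spider release.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="Marchanjo/spider-pt", repo_type="dataset")
print(f"Dataset files downloaded to: {local_dir}")
```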
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory growth of the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (since these techniques usually take as input the question concatenated with the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique, evaluated on the Spider dataset, increased the exact set match accuracy from 0.718 to 0.736 on the validation (Dev) set. Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3) ([SharedIt link](https://rdcu.be/dff19)); [pre-print on arXiv](https://arxiv.org/abs/2306.14256).
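As an illustration of the schema-pruning idea described above, here is a hypothetical sketch (the simple keyword-matching heuristic is an assumption made for clarity; it is not the authors' implementation, which is documented in the Github repository):
```python
# Hypothetical sketch of database schema pruning: before concatenating the question
# with the database schema for the transformer input, drop table/column names that
# do not appear relevant to the question, keeping the sequence within the token budget.
def prune_schema(question: str, schema: dict) -> dict:
    q = question.lower()
    pruned = {}
    for table, columns in schema.items():
        kept = [c for c in columns if c.lower().replace("_", " ") in q]
        if kept or table.lower() in q:
            pruned[table] = kept if kept else columns
    return pruned

schema = {"singer": ["singer_id", "name", "country"], "concert": ["concert_id", "year", "stadium_id"]}
print(prune_schema("How many singers do we have from each country?", schema))
```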
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35); [pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-pt | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T21:49:51+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:37:56+00:00 |
024f35781b98aca3a277e031952074be00072d55 | joaosanches/cleaned_tedtalks_total | [
"region:us"
] | 2024-01-14T21:55:03+00:00 | {"dataset_info": {"features": [{"name": "pt", "dtype": "string"}, {"name": "pt-br", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 66045930, "num_examples": 314702}], "download_size": 41381774, "dataset_size": 66045930}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-30T22:23:36+00:00 |
|
a7727d4aabe6ec9926da703f922c0e55546b5838 | Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for the full explanation, it is better to go to the Github repository [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory growth of the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (since these techniques usually take as input the question concatenated with the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique, evaluated on the Spider dataset, increased the exact set match accuracy from 0.718 to 0.736 on the validation (Dev) set. Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3) ([SharedIt link](https://rdcu.be/dff19)); [pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35); [pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-es | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T21:56:38+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:38:11+00:00 |
c8c8f5d77f54414309c1d2e1ec17f59963d6c185 | Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for the full explanation, it is better to go to the Github repository [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory growth of the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (since these techniques usually take as input the question concatenated with the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique, evaluated on the Spider dataset, increased the exact set match accuracy from 0.718 to 0.736 on the validation (Dev) set. Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3) ([SharedIt link](https://rdcu.be/dff19)); [pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35); [pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-fr | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T22:00:51+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:38:38+00:00 |
c14fdcd0e4725b7ab08d55f345df495d070a29d8 | Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for the full explanation, it is better to go to the Github repository [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory growth of the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (since these techniques usually take as input the question concatenated with the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique, evaluated on the Spider dataset, increased the exact set match accuracy from 0.718 to 0.736 on the validation (Dev) set. Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3) ([SharedIt link](https://rdcu.be/dff19)); [pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35); [pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-en-pt | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T22:04:25+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:38:57+00:00 |
1c1eb8386fa19590a153a69c443ac3ab3b870524 | joaosanches/tedtalks_train_no_duplicates | [
"region:us"
] | 2024-01-14T22:05:23+00:00 | {"dataset_info": {"features": [{"name": "pt", "dtype": "string"}, {"name": "pt-br", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 26649615, "num_examples": 126984}], "download_size": 18481563, "dataset_size": 26649615}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-30T22:23:49+00:00 |
|
61f98144c1d0ef21c1424f80869cf7528f239a4f | fooperterooney/huh | [
"region:us"
] | 2024-01-14T22:07:43+00:00 | {"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1718632.0, "num_examples": 3}], "download_size": 1706927, "dataset_size": 1718632.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T22:07:44+00:00 |
|
b1b693e12b8732688fde334dd05021fc7e05421b | Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for the full explanation, it is better to go to the Github repository [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory growth of the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (since these techniques usually take as input the question concatenated with the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique, evaluated on the Spider dataset, increased the exact set match accuracy from 0.718 to 0.736 on the validation (Dev) set. Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3) ([SharedIt link](https://rdcu.be/dff19)); [pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35); [pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-en-pt-es-fr | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T22:10:51+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:43:13+00:00 |
0ab51c1dbcda1a84dba2bbdf2a5fee3d8c4b34fe |
# Dataset Card for Evaluation run of Locutusque/Rhino-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Rhino-Mistral-7B](https://huggingface.co/Locutusque/Rhino-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Rhino-Mistral-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T22:10:37.195277](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Rhino-Mistral-7B/blob/main/results_2024-01-14T22-10-37.195277.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48839477081957533,
"acc_stderr": 0.034627634904041645,
"acc_norm": 0.49321170014255594,
"acc_norm_stderr": 0.0353856151916697,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219367,
"mc2": 0.4589835712394215,
"mc2_stderr": 0.014873298625532366
},
"harness|arc:challenge|25": {
"acc": 0.43430034129692835,
"acc_stderr": 0.014484703048857364,
"acc_norm": 0.4812286689419795,
"acc_norm_stderr": 0.014601090150633964
},
"harness|hellaswag|10": {
"acc": 0.5212109141605258,
"acc_stderr": 0.004985289555586536,
"acc_norm": 0.7142003584943238,
"acc_norm_stderr": 0.004508710891053852
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5283018867924528,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.5283018867924528,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317216,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317216
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.03464881675016338,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.03464881675016338
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414358,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414358
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4076923076923077,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.4076923076923077,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712156,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712156
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4789915966386555,
"acc_stderr": 0.032449808499900284,
"acc_norm": 0.4789915966386555,
"acc_norm_stderr": 0.032449808499900284
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.655045871559633,
"acc_stderr": 0.020380605405066962,
"acc_norm": 0.655045871559633,
"acc_norm_stderr": 0.020380605405066962
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.03172295004332329,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.03172295004332329
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179662,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179662
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.030782321577688173,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.030782321577688173
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6577266922094508,
"acc_stderr": 0.016967031766413624,
"acc_norm": 0.6577266922094508,
"acc_norm_stderr": 0.016967031766413624
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.02690290045866664,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.02690290045866664
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348408,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348408
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138286,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138286
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5401929260450161,
"acc_stderr": 0.028306190403305693,
"acc_norm": 0.5401929260450161,
"acc_norm_stderr": 0.028306190403305693
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5308641975308642,
"acc_stderr": 0.02776768960683392,
"acc_norm": 0.5308641975308642,
"acc_norm_stderr": 0.02776768960683392
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.02847350127296376,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.02847350127296376
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36114732724902215,
"acc_stderr": 0.012267935477519028,
"acc_norm": 0.36114732724902215,
"acc_norm_stderr": 0.012267935477519028
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.030290619180485694,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.030290619180485694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4362745098039216,
"acc_stderr": 0.02006287424353913,
"acc_norm": 0.4362745098039216,
"acc_norm_stderr": 0.02006287424353913
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5510204081632653,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.5510204081632653,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219367,
"mc2": 0.4589835712394215,
"mc2_stderr": 0.014873298625532366
},
"harness|winogrande|5": {
"acc": 0.7111286503551697,
"acc_stderr": 0.012738241271018445
},
"harness|gsm8k|5": {
"acc": 0.221379833206975,
"acc_stderr": 0.011436000004253521
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Locutusque__Rhino-Mistral-7B | [
"region:us"
] | 2024-01-14T22:13:01+00:00 | {"pretty_name": "Evaluation run of Locutusque/Rhino-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Locutusque/Rhino-Mistral-7B](https://huggingface.co/Locutusque/Rhino-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Rhino-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T22:10:37.195277](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Rhino-Mistral-7B/blob/main/results_2024-01-14T22-10-37.195277.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48839477081957533,\n \"acc_stderr\": 0.034627634904041645,\n \"acc_norm\": 0.49321170014255594,\n \"acc_norm_stderr\": 0.0353856151916697,\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219367,\n \"mc2\": 0.4589835712394215,\n \"mc2_stderr\": 0.014873298625532366\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.43430034129692835,\n \"acc_stderr\": 0.014484703048857364,\n \"acc_norm\": 0.4812286689419795,\n \"acc_norm_stderr\": 0.014601090150633964\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5212109141605258,\n \"acc_stderr\": 0.004985289555586536,\n \"acc_norm\": 0.7142003584943238,\n \"acc_norm_stderr\": 0.004508710891053852\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874142,\n \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874142\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5283018867924528,\n \"acc_stderr\": 0.0307235352490061,\n \"acc_norm\": 0.5283018867924528,\n \"acc_norm_stderr\": 0.0307235352490061\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 
0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992072,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992072\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n \"acc_stderr\": 0.028229497320317216,\n \"acc_norm\": 0.5612903225806452,\n \"acc_norm_stderr\": 0.028229497320317216\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016338,\n \"acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016338\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414358,\n \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414358\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.4076923076923077,\n \"acc_stderr\": 0.024915243985987847,\n \"acc_norm\": 0.4076923076923077,\n \"acc_norm_stderr\": 0.024915243985987847\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712156,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712156\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4789915966386555,\n \"acc_stderr\": 0.032449808499900284,\n \"acc_norm\": 0.4789915966386555,\n \"acc_norm_stderr\": 0.032449808499900284\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.655045871559633,\n \"acc_stderr\": 0.020380605405066962,\n \"acc_norm\": 0.655045871559633,\n \"acc_norm_stderr\": 0.020380605405066962\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5637254901960784,\n \"acc_stderr\": 0.03480693138457039,\n \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.03480693138457039\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6118143459915611,\n \"acc_stderr\": 0.03172295004332329,\n \"acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.03172295004332329\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179662,\n \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179662\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n \"acc_stderr\": 0.030782321577688173,\n \"acc_norm\": 0.6709401709401709,\n \"acc_norm_stderr\": 0.030782321577688173\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6577266922094508,\n \"acc_stderr\": 0.016967031766413624,\n \"acc_norm\": 
0.6577266922094508,\n \"acc_norm_stderr\": 0.016967031766413624\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n \"acc_stderr\": 0.015609929559348408,\n \"acc_norm\": 0.3206703910614525,\n \"acc_norm_stderr\": 0.015609929559348408\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138286,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138286\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5401929260450161,\n \"acc_stderr\": 0.028306190403305693,\n \"acc_norm\": 0.5401929260450161,\n \"acc_norm_stderr\": 0.028306190403305693\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.02776768960683392,\n \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.02776768960683392\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35106382978723405,\n \"acc_stderr\": 0.02847350127296376,\n \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.02847350127296376\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36114732724902215,\n \"acc_stderr\": 0.012267935477519028,\n \"acc_norm\": 0.36114732724902215,\n \"acc_norm_stderr\": 0.012267935477519028\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.030290619180485694,\n \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.030290619180485694\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4362745098039216,\n \"acc_stderr\": 0.02006287424353913,\n \"acc_norm\": 0.4362745098039216,\n \"acc_norm_stderr\": 0.02006287424353913\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219367,\n \"mc2\": 0.4589835712394215,\n \"mc2_stderr\": 0.014873298625532366\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7111286503551697,\n \"acc_stderr\": 0.012738241271018445\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.221379833206975,\n \"acc_stderr\": 0.011436000004253521\n }\n}\n```", "repo_url": 
"https://huggingface.co/Locutusque/Rhino-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|arc:challenge|25_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|gsm8k|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hellaswag|10_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-10-37.195277.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-10-37.195277.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-10-37.195277.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T22-10-37.195277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-10-37.195277.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-10-37.195277.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["**/details_harness|winogrande|5_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T22-10-37.195277.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T22_10_37.195277", "path": ["results_2024-01-14T22-10-37.195277.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T22-10-37.195277.parquet"]}]}]} | 2024-01-14T22:13:22+00:00 |
27a559a4b17aa0f7752d010cef9bf61f1bb528c7 | SoorajK1/questions_and_answers | [
"region:us"
] | 2024-01-14T22:15:11+00:00 | {} | 2024-01-30T07:48:29+00:00 |
|
1d3690cd4ded8e9355e2e98ee9dd616ce3ef6792 | spedr/twt | [
"region:us"
] | 2024-01-14T22:21:37+00:00 | {} | 2024-01-15T05:45:44+00:00 |
|
c333c0646b7da531a9eed8f33fd059dce7ec7886 | Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), you can download the model's checkpoints and datasets, but to understand is better to go to Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (as techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of tables and columns names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy results from 0.718 to 0.736 in a validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3), [here is the SharedIt link](https://rdcu.be/dff19), and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
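As a rough illustration of the schema pruning described above (not the authors' implementation; the lexical-overlap heuristic below is an assumption made only for this sketch, and real systems also match database values):

```python
# Minimal sketch: keep only tables and columns that lexically overlap with the question.
def prune_schema(question: str, schema: dict) -> dict:
    """schema maps table name -> list of column names."""
    q_tokens = {tok.lower().strip("?,.") for tok in question.split()}

    def related(name: str) -> bool:
        parts = name.lower().split("_")
        return any(p == t or t.startswith(p) or p.startswith(t)
                   for p in parts for t in q_tokens)

    pruned = {}
    for table, columns in schema.items():
        kept = [c for c in columns if related(c)]
        if related(table) or kept:
            pruned[table] = kept or columns[:1]  # keep at least one column per kept table
    return pruned


question = "What is the name and country of each singer?"
schema = {"singer": ["singer_id", "name", "country"],
          "concert": ["concert_id", "venue", "year"]}
print(prune_schema(question, schema))
# -> {'singer': ['singer_id', 'name', 'country']}
```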
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here is the pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-en-pt-es-fr-enr-enb | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T22:22:27+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:39:30+00:00 |
a0d547c35251fd2b79a1da3f5b286d2f7f35f4b4 | VerminRed/Ngin | [
"license:openrail",
"region:us"
] | 2024-01-14T22:23:32+00:00 | {"license": "openrail"} | 2024-01-14T22:24:26+00:00 |
|
27d2c438a943e4786bc8ac67fa663708b60530c1 |
This is a subset (2000 samples) of the [`timdettmers/openassistant-guanaco`](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) dataset, processed to match Mistral-7B-instruct-v0.2's prompt format as described [in this article](https://huggingface.co/blog/llama2#how-to-prompt-llama-2). It was created using this [colab notebook](https://colab.research.google.com/drive/1afeicfJa9Mo8-wEcDoGrjyoVLyFkF9xm?usp=sharing).
Inspired by Maxime Labonne's [llm-course repo](https://github.com/mlabonne/llm-course).
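A minimal sketch of that kind of reformatting is shown below; it assumes the source rows use `### Human:` / `### Assistant:` turns and the `<s>[INST] ... [/INST]` chat template, and may differ from the actual notebook (the shuffle seed and sample selection here are illustrative):

```python
import re
from datasets import load_dataset

def to_inst_format(example):
    # Split the guanaco text into alternating human/assistant turns and
    # re-wrap them in the [INST] ... [/INST] chat template.
    turns = re.split(r"### (?:Human|Assistant): ", example["text"])[1:]
    text = ""
    for human, assistant in zip(turns[0::2], turns[1::2]):
        text += f"<s>[INST] {human.strip()} [/INST] {assistant.strip()} </s>"
    return {"text": text}

ds = load_dataset("timdettmers/openassistant-guanaco", split="train")
ds_2k = ds.shuffle(seed=42).select(range(2000)).map(to_inst_format)
```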
| wenqiglantz/guanaco-llama2-2k | [
"region:us"
] | 2024-01-14T22:27:26+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3211457, "num_examples": 2000}], "download_size": 1887239, "dataset_size": 3211457}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T04:30:43+00:00 |
56bfd86ce9c804b0ca40b4d750d4c23225e2b6c3 | anmorgan24/pedro-pascal | [
"region:us"
] | 2024-01-14T22:30:33+00:00 | {} | 2024-01-14T22:33:59+00:00 |
|
d593cde04b493e89bab6a4fa30be891c05c41c0b | Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), you can download the model's checkpoints and datasets, but to understand is better to go to Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (as techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of tables and columns names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy results from 0.718 to 0.736 in a validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3), [here is the SharedIt link](https://rdcu.be/dff19), and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here is the pre-print on arXiv](https://arxiv.org/abs/2110.03546).
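As a rough sketch of the multilingual training setup described above (training on the original and translated questions together), one can simply concatenate the per-language Spider files; the file names below are hypothetical and may not match the exact files shipped with this dataset:

```python
import json

# Hypothetical file names for the English/Portuguese/Spanish/French question files.
files = ["train_spider_en.json", "train_spider_pt.json",
         "train_spider_es.json", "train_spider_fr.json"]

merged = []
for path in files:
    with open(path, encoding="utf-8") as f:
        merged.extend(json.load(f))  # each file holds a list of {question, query, db_id, ...}

with open("train_spider_multilingual.json", "w", encoding="utf-8") as f:
    json.dump(merged, f, ensure_ascii=False, indent=2)

print(f"merged training examples: {len(merged)}")
```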
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-en-pt-es-fr-extra-3enr-3ptr-3esr-3frr | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T22:31:23+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:39:46+00:00 |
d4e3597a54259009bd3ba52c7cd135b1930c4fa0 | # Dataset Card for "autotrain-data-autotrain-tres"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | pedromigurasdev/autotrain-data-autotrain-tres | [
"region:us"
] | 2024-01-14T22:39:23+00:00 | {"dataset_info": {"features": [{"name": "autotrain_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1145062, "num_examples": 758}, {"name": "validation", "num_bytes": 1145062, "num_examples": 758}], "download_size": 1344524, "dataset_size": 2290124}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T22:39:25+00:00 |
8e4a5ce0d07a6c5603c3a4181b5fe7e8388bc8eb |
# Dataset Card for Evaluation run of CallComply/SOLAR-10.7B-Instruct-v1.0-128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CallComply/SOLAR-10.7B-Instruct-v1.0-128k](https://huggingface.co/CallComply/SOLAR-10.7B-Instruct-v1.0-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k",
"harness_winogrande_5",
split="train")
```
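
The aggregated metrics stored in the "results" configuration can be loaded the same way, for instance:

```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k",
	"results",
	split="train")
```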
## Latest results
These are the [latest results from run 2024-01-14T22:38:12.148949](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k/blob/main/results_2024-01-14T22-38-12.148949.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5736345987046274,
"acc_stderr": 0.033417579618165875,
"acc_norm": 0.5822139213719528,
"acc_norm_stderr": 0.03421698352385503,
"mc1": 0.48592411260709917,
"mc1_stderr": 0.017496563717042793,
"mc2": 0.6542262778057006,
"mc2_stderr": 0.015681013574816827
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759091,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892973
},
"harness|hellaswag|10": {
"acc": 0.6415056761601274,
"acc_stderr": 0.004785781979354868,
"acc_norm": 0.8434574785899224,
"acc_norm_stderr": 0.003626262805442223
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.02989060968628664,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.02989060968628664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099522,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099522
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.04489539350270699,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.04489539350270699
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819064,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819064
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957543,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957543
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806585,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915332,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915332
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042338,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042338
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.763302752293578,
"acc_stderr": 0.018224078117299106,
"acc_norm": 0.763302752293578,
"acc_norm_stderr": 0.018224078117299106
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990948,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990948
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.015384352284543932,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.015384352284543932
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242836,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172544,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602653,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647011994,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647011994
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885992,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885992
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777518,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777518
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4975124378109453,
"acc_stderr": 0.03535490150137288,
"acc_norm": 0.4975124378109453,
"acc_norm_stderr": 0.03535490150137288
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48592411260709917,
"mc1_stderr": 0.017496563717042793,
"mc2": 0.6542262778057006,
"mc2_stderr": 0.015681013574816827
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938256
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.0070864621279544985
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k | [
"region:us"
] | 2024-01-14T22:40:30+00:00 | {"pretty_name": "Evaluation run of CallComply/SOLAR-10.7B-Instruct-v1.0-128k", "dataset_summary": "Dataset automatically created during the evaluation run of model [CallComply/SOLAR-10.7B-Instruct-v1.0-128k](https://huggingface.co/CallComply/SOLAR-10.7B-Instruct-v1.0-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T22:38:12.148949](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k/blob/main/results_2024-01-14T22-38-12.148949.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5736345987046274,\n \"acc_stderr\": 0.033417579618165875,\n \"acc_norm\": 0.5822139213719528,\n \"acc_norm_stderr\": 0.03421698352385503,\n \"mc1\": 0.48592411260709917,\n \"mc1_stderr\": 0.017496563717042793,\n \"mc2\": 0.6542262778057006,\n \"mc2_stderr\": 0.015681013574816827\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759091,\n \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892973\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6415056761601274,\n \"acc_stderr\": 0.004785781979354868,\n \"acc_norm\": 0.8434574785899224,\n \"acc_norm_stderr\": 0.003626262805442223\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.02989060968628664,\n \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.02989060968628664\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099522,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099522\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.04489539350270699,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.04489539350270699\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819064,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819064\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n \"acc_stderr\": 0.027327548447957543,\n \"acc_norm\": 0.6387096774193548,\n \"acc_norm_stderr\": 0.027327548447957543\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806585,\n \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806585\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915332,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915332\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.025049197876042338,\n \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042338\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.763302752293578,\n \"acc_stderr\": 0.018224078117299106,\n \"acc_norm\": 0.763302752293578,\n \"acc_norm_stderr\": 0.018224078117299106\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990948,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990948\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n \"acc_stderr\": 
0.015384352284543932,\n \"acc_norm\": 0.7547892720306514,\n \"acc_norm_stderr\": 0.015384352284543932\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242836,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242836\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n \"acc_stderr\": 0.015551673652172544,\n \"acc_norm\": 0.31620111731843575,\n \"acc_norm_stderr\": 0.015551673652172544\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602653,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602653\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n \"acc_stderr\": 0.027731258647011994,\n \"acc_norm\": 0.6077170418006431,\n \"acc_norm_stderr\": 0.027731258647011994\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n \"acc_stderr\": 0.012620785155885992,\n \"acc_norm\": 0.423728813559322,\n \"acc_norm_stderr\": 0.012620785155885992\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777518,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777518\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4975124378109453,\n \"acc_stderr\": 0.03535490150137288,\n \"acc_norm\": 0.4975124378109453,\n \"acc_norm_stderr\": 0.03535490150137288\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48592411260709917,\n \"mc1_stderr\": 0.017496563717042793,\n \"mc2\": 0.6542262778057006,\n \"mc2_stderr\": 0.015681013574816827\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938256\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \"acc_stderr\": 0.0070864621279544985\n }\n}\n```", "repo_url": 
"https://huggingface.co/CallComply/SOLAR-10.7B-Instruct-v1.0-128k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|arc:challenge|25_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|gsm8k|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hellaswag|10_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-38-12.148949.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-38-12.148949.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-38-12.148949.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T22-38-12.148949.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-38-12.148949.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T22_38_12.148949", "path": ["**/details_harness|winogrande|5_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T22-38-12.148949.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T22_38_12.148949", "path": ["results_2024-01-14T22-38-12.148949.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T22-38-12.148949.parquet"]}]}]} | 2024-01-14T22:40:49+00:00 |
f2db36ac37e9ac9db6d17a8bc7dffdf9a86fc230 | Foquss/jahprayzah | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T22:43:58+00:00 | {"license": "apache-2.0"} | 2024-01-14T22:43:58+00:00 |
|
86a00c83b88e66f867ab4b558450dd8860a6353a | samcp210/voices | [
"region:us"
] | 2024-01-14T22:44:45+00:00 | {} | 2024-01-14T22:44:45+00:00 |
|
890125115eb54752ed1ff6921273656f0a97c9ab | maxmyn/wholesome_greentext_180k | [
"region:us"
] | 2024-01-14T22:45:45+00:00 | {"dataset_info": {"features": [{"name": "greentexts", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 23675637, "num_examples": 179561}], "download_size": 14651344, "dataset_size": 23675637}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T22:45:51+00:00 |
|
5a106b6118a56ccd37d385ba6f3b9409ddb7adf2 | slasocrates/DulceMaria | [
"license:openrail",
"region:us"
] | 2024-01-14T22:46:45+00:00 | {"license": "openrail"} | 2024-01-14T22:49:11+00:00 |
|
73ba7886521de84706107e9a7226e13763211604 | nisancoskun/finnish_sentiment_data | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"source_datasets:sepidmnorozy/Finnish_sentiment",
"source_datasets:https://github.com/cynarr/sentiment-analysis",
"language:fi",
"license:mit",
"region:us"
] | 2024-01-14T22:49:49+00:00 | {"language": ["fi"], "license": "mit", "size_categories": ["10K<n<100K"], "source_datasets": ["sepidmnorozy/Finnish_sentiment", "https://github.com/cynarr/sentiment-analysis"], "task_categories": ["text-classification"]} | 2024-01-16T16:44:31+00:00 |
|
c632ea8e45be1687f511a2d02818538073bc985c | Distributed under the Creative Commons BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets, but for a full understanding it is better to go to the Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (as techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy results from 0.718 to 0.736 in a validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3), [here the SharedIt link](https://rdcu.be/dff19). [here the pre-print in arXiv](https://arxiv.org/abs/2306.14256).
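A minimal sketch of the schema-pruning idea described above (illustrative only; the keyword-overlap heuristic and the `prune_schema` helper are assumptions for this example, not the exact relevance criteria used by mRAT-SQL-FIT):
```python
# Illustrative schema pruning: drop tables/columns whose names do not overlap
# with the question tokens, so the serialized question+schema input stays
# within the transformer's 512-token budget. (Assumed heuristic, not the
# paper's exact criteria.)

def prune_schema(question: str, schema: dict) -> dict:
    """schema maps table name -> list of column names; returns a pruned copy."""
    q_tokens = {tok.strip("?.,").lower() for tok in question.split()}

    def relevant(name: str) -> bool:
        name = name.lower().replace("_", " ")
        return any(tok and (tok in name or name in tok) for tok in q_tokens)

    pruned = {}
    for table, columns in schema.items():
        if relevant(table):
            pruned[table] = columns                      # matching table keeps all columns
        else:
            kept = [col for col in columns if relevant(col)]
            if kept:
                pruned[table] = kept                     # otherwise keep only matching columns
    return pruned


schema = {"singer": ["singer_id", "name", "age"], "concert": ["concert_id", "year"]}
print(prune_schema("What is the average age of all singers?", schema))
# {'singer': ['singer_id', 'name', 'age']}
```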
# mRAT-SQL+GAP
## mRAT-SQL+GAP:A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-FIT-en | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T22:50:27+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:42:58+00:00 |
91dba49e507b26408827372c45e6899f7b4fb59a |
# Dataset Card for Evaluation run of CallComply/Starling-LM-11B-alpha
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CallComply/Starling-LM-11B-alpha](https://huggingface.co/CallComply/Starling-LM-11B-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CallComply__Starling-LM-11B-alpha",
"harness_winogrande_5",
split="train")
```
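The aggregated metrics mentioned above live in the "results" configuration and can be loaded the same way (a small sketch following the same config/split naming convention as the per-task details):
```python
from datasets import load_dataset

# Aggregated run-level metrics; the "latest" split points to the most recent eval.
results = load_dataset("open-llm-leaderboard/details_CallComply__Starling-LM-11B-alpha",
                       "results",
                       split="latest")
```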
## Latest results
These are the [latest results from run 2024-01-14T22:50:55.626486](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__Starling-LM-11B-alpha/blob/main/results_2024-01-14T22-50-55.626486.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6124497978149351,
"acc_stderr": 0.032857819921299845,
"acc_norm": 0.618390298674969,
"acc_norm_stderr": 0.03352975999467289,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4153002055665266,
"mc2_stderr": 0.014702058713161457
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6105357498506274,
"acc_stderr": 0.0048663222583359665,
"acc_norm": 0.8198566022704641,
"acc_norm_stderr": 0.0038352143402103785
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031086,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031086
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.01665927970029582,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.01665927970029582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419996,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419996
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217582,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217582
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.01653117099327889,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.01653117099327889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616295,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616295
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573705,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573705
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4153002055665266,
"mc2_stderr": 0.014702058713161457
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.011631268360607778
},
"harness|gsm8k|5": {
"acc": 0.35178165276724793,
"acc_stderr": 0.01315344602353602
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CallComply__Starling-LM-11B-alpha | [
"region:us"
] | 2024-01-14T22:53:11+00:00 | {"pretty_name": "Evaluation run of CallComply/Starling-LM-11B-alpha", "dataset_summary": "Dataset automatically created during the evaluation run of model [CallComply/Starling-LM-11B-alpha](https://huggingface.co/CallComply/Starling-LM-11B-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CallComply__Starling-LM-11B-alpha\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T22:50:55.626486](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__Starling-LM-11B-alpha/blob/main/results_2024-01-14T22-50-55.626486.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6124497978149351,\n \"acc_stderr\": 0.032857819921299845,\n \"acc_norm\": 0.618390298674969,\n \"acc_norm_stderr\": 0.03352975999467289,\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4153002055665266,\n \"mc2_stderr\": 0.014702058713161457\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6105357498506274,\n \"acc_stderr\": 0.0048663222583359665,\n \"acc_norm\": 0.8198566022704641,\n \"acc_norm_stderr\": 0.0038352143402103785\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944444,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031086,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031086\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029582,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029582\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.03219079200419996,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.03219079200419996\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n \"acc_stderr\": 0.014248873549217582,\n 
\"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.014248873549217582\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n \"acc_stderr\": 0.01653117099327889,\n \"acc_norm\": 0.4245810055865922,\n \"acc_norm_stderr\": 0.01653117099327889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n \"acc_stderr\": 0.012661233805616295,\n \"acc_norm\": 0.4348109517601043,\n \"acc_norm_stderr\": 0.012661233805616295\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573705,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573705\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4153002055665266,\n \"mc2_stderr\": 0.014702058713161457\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.011631268360607778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35178165276724793,\n \"acc_stderr\": 0.01315344602353602\n }\n}\n```", "repo_url": 
"https://huggingface.co/CallComply/Starling-LM-11B-alpha", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|arc:challenge|25_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|gsm8k|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hellaswag|10_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-50-55.626486.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-50-55.626486.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-50-55.626486.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T22-50-55.626486.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-50-55.626486.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-50-55.626486.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["**/details_harness|winogrande|5_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T22-50-55.626486.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T22_50_55.626486", "path": ["results_2024-01-14T22-50-55.626486.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T22-50-55.626486.parquet"]}]}]} | 2024-01-14T22:53:32+00:00 |
37a21caed0c61f5e525d45fcb1e8577a83fafd00 | ibivibiv/plantuml-training | [
"region:us"
] | 2024-01-14T23:03:25+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1569689, "num_examples": 972}], "download_size": 681556, "dataset_size": 1569689}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T23:08:35+00:00 |
|
a76e08334dc92d55dfaeae38e73b81010ee78ae2 |
# Dataset Card for Evaluation run of cloudyu/Yi-34Bx3-MoE-90B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Yi-34Bx3-MoE-90B](https://huggingface.co/cloudyu/Yi-34Bx3-MoE-90B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Yi-34Bx3-MoE-90B",
"harness_winogrande_5",
split="train")
```
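The configuration names used above are the ones listed in this card's metadata. As a small optional sketch (not part of the auto-generated card), the standard `get_dataset_config_names` helper from the `datasets` library can list them before picking one:
```python
from datasets import get_dataset_config_names

# List every per-task configuration exposed by this details repository,
# e.g. "harness_winogrande_5", "harness_gsm8k_5" or "results".
configs = get_dataset_config_names("open-llm-leaderboard/details_cloudyu__Yi-34Bx3-MoE-90B")
print(configs)
```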
## Latest results
These are the [latest results from run 2024-01-14T23:01:35.520046](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Yi-34Bx3-MoE-90B/blob/main/results_2024-01-14T23-01-35.520046.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.770922119161067,
"acc_stderr": 0.027863740601296195,
"acc_norm": 0.774340723628372,
"acc_norm_stderr": 0.02839947094621756,
"mc1": 0.49326805385556916,
"mc1_stderr": 0.017501914492655386,
"mc2": 0.6631117489702718,
"mc2_stderr": 0.01453284217897903
},
"harness|arc:challenge|25": {
"acc": 0.6723549488054608,
"acc_stderr": 0.01371584794071934,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.01327307786590759
},
"harness|hellaswag|10": {
"acc": 0.6586337382991436,
"acc_stderr": 0.004731989816563666,
"acc_norm": 0.8533160724955188,
"acc_norm_stderr": 0.003530675014892315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.02427022773752271,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.02427022773752271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8037735849056604,
"acc_stderr": 0.024442388131100806,
"acc_norm": 0.8037735849056604,
"acc_norm_stderr": 0.024442388131100806
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.026280550932848087,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.026280550932848087
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8,
"acc_stderr": 0.0261488180184245,
"acc_norm": 0.8,
"acc_norm_stderr": 0.0261488180184245
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.03455930201924814,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.03455930201924814
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7195767195767195,
"acc_stderr": 0.023135287974325618,
"acc_norm": 0.7195767195767195,
"acc_norm_stderr": 0.023135287974325618
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.017066403719657255,
"acc_norm": 0.9,
"acc_norm_stderr": 0.017066403719657255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969567,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656187,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656187
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.01699999492742161,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.01699999492742161
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.823076923076923,
"acc_stderr": 0.01934807017439699,
"acc_norm": 0.823076923076923,
"acc_norm_stderr": 0.01934807017439699
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45555555555555555,
"acc_stderr": 0.03036486250482443,
"acc_norm": 0.45555555555555555,
"acc_norm_stderr": 0.03036486250482443
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673957,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673957
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.011800361363016581,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.011800361363016581
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6712962962962963,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.6712962962962963,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089674,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065515,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065515
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.027373095500540193,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.027373095500540193
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.02622223517147737,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.02622223517147737
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.031457038543062504,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.031457038543062504
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8834355828220859,
"acc_stderr": 0.025212327210507094,
"acc_norm": 0.8834355828220859,
"acc_norm_stderr": 0.025212327210507094
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.912621359223301,
"acc_stderr": 0.027960689125970654,
"acc_norm": 0.912621359223301,
"acc_norm_stderr": 0.027960689125970654
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253876,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253876
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292849,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292849
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8236994219653179,
"acc_stderr": 0.020516425672490714,
"acc_norm": 0.8236994219653179,
"acc_norm_stderr": 0.020516425672490714
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8011173184357542,
"acc_stderr": 0.013349892983092521,
"acc_norm": 0.8011173184357542,
"acc_norm_stderr": 0.013349892983092521
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.020091188936043693,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.020091188936043693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.02135534302826405,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.02135534302826405
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8858024691358025,
"acc_stderr": 0.017696832447213897,
"acc_norm": 0.8858024691358025,
"acc_norm_stderr": 0.017696832447213897
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.028663820147199485,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.028663820147199485
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6029986962190352,
"acc_stderr": 0.012496346982909554,
"acc_norm": 0.6029986962190352,
"acc_norm_stderr": 0.012496346982909554
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8419117647058824,
"acc_stderr": 0.02216146260806852,
"acc_norm": 0.8419117647058824,
"acc_norm_stderr": 0.02216146260806852
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8251633986928104,
"acc_stderr": 0.015366167064780644,
"acc_norm": 0.8251633986928104,
"acc_norm_stderr": 0.015366167064780644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.02342097206916635,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.02342097206916635
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534108,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534108
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587952,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587952
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355044,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49326805385556916,
"mc1_stderr": 0.017501914492655386,
"mc2": 0.6631117489702718,
"mc2_stderr": 0.01453284217897903
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598484
},
"harness|gsm8k|5": {
"acc": 0.7285822592873389,
"acc_stderr": 0.01224900202615058
}
}
```
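As a minimal sketch (assuming the aggregated "results" configuration mirrors the keys of the JSON summary above), the same metrics can also be inspected directly with the `datasets` library:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points to
# the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Yi-34Bx3-MoE-90B",
    "results",
    split="latest",
)
print(results.column_names)
```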
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Yi-34Bx3-MoE-90B | [
"region:us"
] | 2024-01-14T23:03:51+00:00 | {"pretty_name": "Evaluation run of cloudyu/Yi-34Bx3-MoE-90B", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Yi-34Bx3-MoE-90B](https://huggingface.co/cloudyu/Yi-34Bx3-MoE-90B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Yi-34Bx3-MoE-90B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T23:01:35.520046](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Yi-34Bx3-MoE-90B/blob/main/results_2024-01-14T23-01-35.520046.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.770922119161067,\n \"acc_stderr\": 0.027863740601296195,\n \"acc_norm\": 0.774340723628372,\n \"acc_norm_stderr\": 0.02839947094621756,\n \"mc1\": 0.49326805385556916,\n \"mc1_stderr\": 0.017501914492655386,\n \"mc2\": 0.6631117489702718,\n \"mc2_stderr\": 0.01453284217897903\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6723549488054608,\n \"acc_stderr\": 0.01371584794071934,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.01327307786590759\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6586337382991436,\n \"acc_stderr\": 0.004731989816563666,\n \"acc_norm\": 0.8533160724955188,\n \"acc_norm_stderr\": 0.003530675014892315\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.02427022773752271,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.02427022773752271\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100806,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100806\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.026280550932848087,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.026280550932848087\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0261488180184245,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0261488180184245\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.03455930201924814,\n \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.03455930201924814\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7195767195767195,\n \"acc_stderr\": 0.023135287974325618,\n \"acc_norm\": 0.7195767195767195,\n \"acc_norm_stderr\": 0.023135287974325618\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.5873015873015873,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.017066403719657255,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.017066403719657255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969567,\n \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656187,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656187\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.01699999492742161,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.01699999492742161\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.823076923076923,\n \"acc_stderr\": 0.01934807017439699,\n 
\"acc_norm\": 0.823076923076923,\n \"acc_norm_stderr\": 0.01934807017439699\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45555555555555555,\n \"acc_stderr\": 0.03036486250482443,\n \"acc_norm\": 0.45555555555555555,\n \"acc_norm_stderr\": 0.03036486250482443\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673957,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673957\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016581,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016581\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6712962962962963,\n \"acc_stderr\": 0.032036140846700596,\n \"acc_norm\": 0.6712962962962963,\n \"acc_norm_stderr\": 0.032036140846700596\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089674,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089674\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065515,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065515\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.027373095500540193,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.027373095500540193\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.02622223517147737,\n \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.02622223517147737\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.031457038543062504,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.031457038543062504\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8834355828220859,\n \"acc_stderr\": 0.025212327210507094,\n \"acc_norm\": 0.8834355828220859,\n \"acc_norm_stderr\": 0.025212327210507094\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253876,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253876\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n \"acc_stderr\": 0.010397417087292849,\n \"acc_norm\": 0.9067688378033205,\n \"acc_norm_stderr\": 
0.010397417087292849\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8236994219653179,\n \"acc_stderr\": 0.020516425672490714,\n \"acc_norm\": 0.8236994219653179,\n \"acc_norm_stderr\": 0.020516425672490714\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8011173184357542,\n \"acc_stderr\": 0.013349892983092521,\n \"acc_norm\": 0.8011173184357542,\n \"acc_norm_stderr\": 0.013349892983092521\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043693,\n \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043693\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n \"acc_stderr\": 0.02135534302826405,\n \"acc_norm\": 0.8295819935691319,\n \"acc_norm_stderr\": 0.02135534302826405\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8858024691358025,\n \"acc_stderr\": 0.017696832447213897,\n \"acc_norm\": 0.8858024691358025,\n \"acc_norm_stderr\": 0.017696832447213897\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.028663820147199485,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.028663820147199485\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6029986962190352,\n \"acc_stderr\": 0.012496346982909554,\n \"acc_norm\": 0.6029986962190352,\n \"acc_norm_stderr\": 0.012496346982909554\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8419117647058824,\n \"acc_stderr\": 0.02216146260806852,\n \"acc_norm\": 0.8419117647058824,\n \"acc_norm_stderr\": 0.02216146260806852\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8251633986928104,\n \"acc_stderr\": 0.015366167064780644,\n \"acc_norm\": 0.8251633986928104,\n \"acc_norm_stderr\": 0.015366167064780644\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.02342097206916635,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.02342097206916635\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n \"acc_stderr\": 0.020687186951534108,\n \"acc_norm\": 0.9054726368159204,\n \"acc_norm_stderr\": 0.020687186951534108\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587952,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587952\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355044,\n \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49326805385556916,\n \"mc1_stderr\": 0.017501914492655386,\n \"mc2\": 0.6631117489702718,\n \"mc2_stderr\": 0.01453284217897903\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598484\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7285822592873389,\n \"acc_stderr\": 0.01224900202615058\n }\n}\n```", "repo_url": "https://huggingface.co/cloudyu/Yi-34Bx3-MoE-90B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|arc:challenge|25_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|gsm8k|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hellaswag|10_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T23-01-35.520046.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T23-01-35.520046.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T23-01-35.520046.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T23-01-35.520046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T23-01-35.520046.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T23-01-35.520046.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["**/details_harness|winogrande|5_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T23-01-35.520046.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T23_01_35.520046", "path": ["results_2024-01-14T23-01-35.520046.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T23-01-35.520046.parquet"]}]}]} | 2024-01-14T23:04:13+00:00 |
a8e719bd0bb9c28f5718f98a536b9e412f2d07ae |
# open-english-wordnet-synset-2023
Open English WordNet (2023)
## Dataset Details
### Dataset Description
Open English WordNet is a lexical network of the English language grouping words into synsets and linking them according to relationships such as hypernymy, antonymy and meronymy. It is intended to be used in natural language processing applications and provides deep lexical information about the English language as a graph.
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/globalwordnet/english-wordnet
- **Paper:** John P. McCrae, Alexandre Rademaker, Francis Bond, Ewa Rudnicka and Christiane Fellbaum (2019) [English WordNet 2019 – An Open-Source WordNet for English](https://aclanthology.org/2019.gwc-1.31/). In Proceedings of the 10th Global WordNet Conference – GWC 2019, Wrocław
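For quick exploration, a minimal loading sketch is shown below; it assumes the default configuration and a `train` split built from the bundled JSONL file, and simply inspects the first record since the per-record fields are not documented on this card.
```python
# Minimal sketch: load the Open English WordNet synset records with the
# Hugging Face `datasets` library. The split name ("train") is assumed to be
# the default split created from the JSONL data file; inspect one record to
# see which fields are available.
from datasets import load_dataset

wordnet = load_dataset("jon-tow/open-english-wordnet-synset-2023", split="train")
print(wordnet)      # number of rows and column names
print(wordnet[0])   # first synset record
```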
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```bibtex
@inproceedings{mccrae-etal-2019-english,
title = "{E}nglish {W}ord{N}et 2019 {--} An Open-Source {W}ord{N}et for {E}nglish",
author = "McCrae, John P. and
Rademaker, Alexandre and
Bond, Francis and
Rudnicka, Ewa and
Fellbaum, Christiane",
editor = "Vossen, Piek and
Fellbaum, Christiane",
booktitle = "Proceedings of the 10th Global Wordnet Conference",
month = jul,
year = "2019",
address = "Wroclaw, Poland",
publisher = "Global Wordnet Association",
url = "https://aclanthology.org/2019.gwc-1.31",
pages = "245--252",
abstract = "We describe the release of a new wordnet for English based on the Princeton WordNet, but now developed under an open-source model. In particular, this version of WordNet, which we call English WordNet 2019, which has been developed by multiple people around the world through GitHub, fixes many errors in previous wordnets for English. We give some details of the changes that have been made in this version and give some perspectives about likely future changes that will be made as this project continues to evolve.",
}
``` | jon-tow/open-english-wordnet-synset-2023 | [
"license:cc-by-4.0",
"region:us"
] | 2024-01-14T23:07:28+00:00 | {"license": "cc-by-4.0", "configs": [{"config_name": "default", "data_files": "open_english_wordnet_2023.jsonl"}]} | 2024-01-15T04:12:09+00:00 |
cb4b3afe726b52f57490629bb43606760a987c36 | Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for a fuller explanation, see the Github repo [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy from 0.718 to 0.736 on the validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
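To make the schema pruning idea concrete, here is an illustrative sketch only (not the code from this repository): tables and columns whose names share no tokens with the question are dropped before the question and schema are serialized into the transformer input. The function and the toy schema below are hypothetical.
```python
# Illustrative sketch of database schema pruning (hypothetical, not the
# repository's implementation): drop tables/columns whose names share no
# token with the question before serializing question + schema into the
# (<= 512 token) transformer input.
def prune_schema(question: str, schema: dict) -> dict:
    q_tokens = set(question.lower().replace("?", "").split())
    pruned = {}
    for table, columns in schema.items():
        kept = [c for c in columns if set(c.lower().split("_")) & q_tokens]
        # keep a table if its own name matches the question or any column survived
        if set(table.lower().split("_")) & q_tokens or kept:
            pruned[table] = kept or columns
    return pruned

schema = {"singer": ["singer_id", "name", "country"], "concert": ["concert_id", "year"]}
print(prune_schema("How many singers are from each country?", schema))
# -> {'singer': ['country']}
```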
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309)
| Marchanjo/spider-FIT-en-extra-3enr-1enb | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T23:15:12+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:42:35+00:00 |
6db40192dd3f4f10f8e5ec480a2af37870573ac9 | erikbtx/minhavoiizzzz | [
"license:openrail",
"region:us"
] | 2024-01-14T23:15:15+00:00 | {"license": "openrail"} | 2024-01-14T23:15:39+00:00 |
|
38760e8b8b09404f15ddfc3b26ab6b8101dc9e1c | Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for a fuller explanation, see the Github repo [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy from 0.718 to 0.736 on the validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-FIT-pt | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T23:19:56+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:42:01+00:00 |
6f7c24e5a866830895c44dea61b5e6f5666e48f5 | marcelhuber/CompVis_integrals | [
"region:us"
] | 2024-01-14T23:20:31+00:00 | {} | 2024-01-14T23:51:55+00:00 |
|
9bd39793f22c099ecbe171fe3c7f13c2f9f80553 | marcelhuber/CompVis_predictions | [
"region:us"
] | 2024-01-14T23:21:08+00:00 | {} | 2024-01-14T23:29:56+00:00 |
|
649d7aa3b22a008a23ee9eada18bfcb9696773f7 | Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for a fuller explanation, see the Github repo [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy from 0.718 to 0.736 on the validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-FIT-es | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T23:23:57+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:40:08+00:00 |
63ea2584bc2e0ad19e962927a4c866399b00263a | Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for a fuller explanation, see the Github repo [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy from 0.718 to 0.736 on the validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-FIT-fr | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T23:26:40+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:40:24+00:00 |
fd425d27b2792e7766b856a248584475fafceb96 | Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for a fuller explanation, see the Github repo [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy from 0.718 to 0.736 on the validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-FIT-en-pt-es-fr | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T23:29:44+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:40:40+00:00 |
0bb43ab6c704d5468568a112933e496c1a4ef8cc | Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for a fuller explanation, see the Github repo [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy from 0.718 to 0.736 on the validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-FIT-en-pt-es-fr-extra-3enr-3ptr-3esr-3frr | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T23:33:15+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:41:16+00:00 |
4296e705777ea41a4fb2fb943458ab97ee7cc324 | Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model's checkpoints and datasets; for a fuller explanation, see the Github repo [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to the quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy from 0.718 to 0.736 on the validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[Paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [here is the SharedIt link](https://rdcu.be/dff19) and [here is the pre-print on arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) | Marchanjo/spider-FIT-en-pt-es-fr-enr-enb | [
"license:cc-by-sa-4.0",
"arxiv:2306.14256",
"arxiv:2110.03546",
"arxiv:2012.10309",
"region:us"
] | 2024-01-14T23:37:44+00:00 | {"license": "cc-by-sa-4.0"} | 2024-01-16T12:41:32+00:00 |
dc375029bfe036707d91075be93939d127f6fa0c | Arflas/wanted | [
"license:openrail",
"region:us"
] | 2024-01-14T23:43:16+00:00 | {"license": "openrail"} | 2024-01-14T23:44:06+00:00 |
|
ae1700982243110f59f761a63bacf19cce5a7fc3 |
# DATACLYSM PATCH 0.0.2: ARXIV
## USE THE NOTEBOOK TO GET STARTED!
https://github.com/somewheresystems/dataclysm

# somewheresystems/dataclysm-arxiv
This dataset comprises 3,360,984 English-language arXiv papers from the Cornell/arXiv dataset, with two new columns added: title-embeddings and abstract-embeddings. These additional columns were generated using the bge-small-en-v1.5 embeddings model. The dataset was sourced from the Cornell/arXiv GCP bucket's json manifest for arXiv metadata, as of January 14th, 2024: [gs://arxiv-dataset/metadata-v5/arxiv-metadata-oai.json](gs://arxiv-dataset/metadata-v5/arxiv-metadata-oai.json)
# Embeddings Model
We used https://huggingface.co/BAAI/bge-small-en-v1.5 to embed the `title` and `abstract` fields.
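As a rough sketch (not necessarily the exact pipeline used to build this dataset), embeddings of the same kind could be reproduced with the `sentence-transformers` library; the normalization choice and the example texts below are assumptions.
```python
# Sketch: embed a title/abstract with BAAI/bge-small-en-v1.5 via
# sentence-transformers. Whether the dataset's stored embeddings were
# L2-normalized is an assumption; verify against the actual vectors.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BAAI/bge-small-en-v1.5")
texts = [
    "Attention Is All You Need",                  # example title
    "We propose a new network architecture ...",  # example abstract (truncated)
]
embeddings = model.encode(texts, normalize_embeddings=True)
print(embeddings.shape)  # (2, 384): bge-small-en-v1.5 returns 384-dimensional vectors
```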
## Contact
Please contact [email protected] for inquiries. | somewheresystems/dataclysm-arxiv | [
"size_categories:1M<n<10M",
"language:en",
"license:cc0-1.0",
"arxiv",
"science",
"region:us"
] | 2024-01-14T23:51:58+00:00 | {"language": ["en"], "license": "cc0-1.0", "size_categories": ["1M<n<10M"], "pretty_name": "dataclysm-arxiv", "tags": ["arxiv", "science"]} | 2024-02-11T22:30:09+00:00 |
21dda751bf30b284045a55a95957c16b7bd80a90 | Hiraishin/ujianjpj-test-a | [
"license:apache-2.0",
"region:us"
] | 2024-01-15T00:01:29+00:00 | {"license": "apache-2.0"} | 2024-01-15T00:02:46+00:00 |
|
5b93a53603b82bc9ffd2fc5d9679261766ce1873 | Phreeeez/Alexeevafap | [
"region:us"
] | 2024-01-15T00:05:00+00:00 | {} | 2024-01-15T00:06:21+00:00 |
|
a3d826cf3c0f0fe71519a3760dd74647273f3fbd |

# End-To-End TEXT-2-ASMR with Transformers
This repository contains pretrained text2asmr model files, audio files and training+inference notebooks.
## Dataset Details
This unique dataset is tailored for training and deploying text-to-speech (TTS) systems specifically focused on ASMR (Autonomous Sensory Meridian Response) content. It includes a comprehensive collection of pretrained model files, audio files and training code suitable for TTS applications.
### Dataset Description
Inside this dataset, you will find the following zipped folders:
1. **wavs_original:** original wav files as converted from the source videos
2. **wavs:** original wav files broken into 1-minute chunks
3. **transcripts_original:** transcribed scripts of the original wav files
4. **transcripts:** transcribed scripts of the files in the wavs folder
5. **models:** text to spectrogram model trained on Glow-TTS
6. **ljspeech:** alignment files and respective checkpoint models (text to phoneme)
7. **transformer_tts_data.ljspeech**: trained checkpoint models and other files
And the following files:
1. **Glow-TTS.ipynb:** Training and inference code for GlowTTS models
2. **TransformerTTS.ipynb:** Training and inference code for Transformer models
3. **VITS_TTS.ipynb:** Optional code for training VITS models; follows the same format as GlowTTS
4. **metadata_original.csv:** ljspeech-formatted transcriptions of the wavs_original folder; ready for TTS training
5. **metadata.csv:** ljspeech-formatted transcriptions of the wavs folder; ready for TTS training (a reading sketch follows this list)
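A minimal sketch of reading one of the metadata files is shown below; the pipe-delimited `id|transcription` layout follows the LJSpeech convention, but the exact number of columns should be verified against the file itself.
```python
# Minimal sketch: read the LJSpeech-style metadata.csv shipped with this dataset.
# Assumed layout: pipe-delimited rows of "utterance_id|transcription" with no
# header (LJSpeech also allows a third, normalized-text column).
import csv

pairs = []
with open("metadata.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f, delimiter="|"):
        utt_id, text = row[0], row[1]
        pairs.append((utt_id + ".wav", text))  # the audio lives in the wavs/ folder

print(len(pairs), "utterances")
print(pairs[0])
```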
- **Curated by:** Alosh Denny, Anish S
- **Language(s) (NLP):** English
- **License:** MIT
### Dataset Sources
**Youtube:** Rebeccas ASMR, Nanou ASMR, Gibi ASMR, Cherie Lorraine ASMR, etc.
## Uses
The dataset can be used to train text2spec2mel, text2wav, and/or other end-to-end text-to-speech models.
### Direct Use
Pretrained models can be tested out with the TransformerTTS notebook and the Glow-TTS notebook.
## Dataset Card Authors
Alosh Denny, Anish S
## Dataset Card Contact
[email protected] | aoxo/text2asmr-uncensored | [
"task_categories:text-to-speech",
"task_categories:text-to-audio",
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"code",
"music",
"doi:10.57967/hf/1610",
"region:us"
] | 2024-01-15T00:12:07+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["text-to-speech", "text-to-audio"], "pretty_name": "Text-to-ASMR", "image": ["https://ibb.co/ZzFkfWZ"], "tags": ["code", "music"]} | 2024-02-13T13:32:33+00:00 |
ad91832a34de99c38c22381ca7f081037310b1d3 | Vitorbr2009/ds-voz-ajuricaba | [
"license:openrail",
"region:us"
] | 2024-01-15T00:18:53+00:00 | {"license": "openrail"} | 2024-01-15T00:19:27+00:00 |
|
2e1bebe81cf48da3cebb81b4aff691d1da82496f | joaosanches/tedtalks_dataset_not_in_train | [
"region:us"
] | 2024-01-15T00:24:31+00:00 | {"dataset_info": {"features": [{"name": "pt", "dtype": "string"}, {"name": "pt-br", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 39396315, "num_examples": 187718}], "download_size": 25225794, "dataset_size": 39396315}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-30T22:24:04+00:00 |
|
a20f8c530fa0b13625be42a3393c030d96d0f018 | snowsense/food-images-1k | [
"task_categories:image-classification",
"size_categories:10B<n<100B",
"language:en",
"license:mit",
"Food",
"Dish",
"Chinese",
"region:us"
] | 2024-01-15T00:27:50+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10B<n<100B"], "task_categories": ["image-classification"], "pretty_name": "Food Images 1K", "tags": ["Food", "Dish", "Chinese"]} | 2024-01-19T12:54:53+00:00 |
|
04348af14c0bc1c2dc16e4ac459a09470622d99e | SeanJIE250/llama2_law2 | [
"region:us"
] | 2024-01-15T00:48:21+00:00 | {} | 2024-01-15T01:00:37+00:00 |
|
d93a0c14c771534baab6301c6f977817344dbfa5 | erikbtx/BASEDERIKTRAING | [
"license:openrail",
"region:us"
] | 2024-01-15T00:57:31+00:00 | {"license": "openrail"} | 2024-01-15T00:57:55+00:00 |
|
76d2f604d27a27784ef087493f04faadcceb80eb |
# Dataset of yatadera_narumi/矢田寺成美 (Touhou)
This is the dataset of yatadera_narumi/矢田寺成美 (Touhou), containing 11 images and their tags.
The core tags of this character are `black_hair, braid, hat, long_hair, twin_braids, bangs, red_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 15.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 9.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 27 | 18.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 13.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 27 | 23.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yatadera_narumi_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, ajirogasa, grey_dress, long_sleeves, solo, red_capelet, buttons, looking_at_viewer, clothes_writing, smile, long_earlobes, own_hands_together, snowing, blush, open_mouth, closed_mouth, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ajirogasa | grey_dress | long_sleeves | solo | red_capelet | buttons | looking_at_viewer | clothes_writing | smile | long_earlobes | own_hands_together | snowing | blush | open_mouth | closed_mouth | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-------------|:---------------|:-------|:--------------|:----------|:--------------------|:------------------|:--------|:----------------|:---------------------|:----------|:--------|:-------------|:---------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/yatadera_narumi_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T01:04:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T01:09:00+00:00 |