| sha | text | id | tags | created_at | metadata | last_modified |
|---|---|---|---|---|---|---|
| 3a22db985dd101a1f14030575159e7c52d944fed | | ArchieMeng/20240217 | ["region:us"] | 2024-02-17T06:58:55+00:00 | {} | 2024-02-17T07:02:31+00:00 |
| be7611934c4988fea6f5004b35315d3ac0641d76 | | miittnnss/test-dataset | ["region:us"] | 2024-02-17T07:05:51+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "caption", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18445.0, "num_examples": 2}], "download_size": 20023, "dataset_size": 18445.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T07:05:53+00:00 |
| 92e559aebea12ba1b83e8b30ab07241d6870052a | | Mrfine/Mrfine | ["region:us"] | 2024-02-17T07:19:00+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4186564, "num_examples": 1000}], "download_size": 2245921, "dataset_size": 4186564}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T07:19:03+00:00 |
| 49e25f7cbc5a82e81dc7f43abe4cbf474f8d08df | | Ataf/mini-platypus | ["region:us"] | 2024-02-17T07:19:21+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4186564, "num_examples": 1000}], "download_size": 2245921, "dataset_size": 4186564}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T07:19:23+00:00 |
| b2be1c38d4361062fb2ad3421d553cc747169568 | | Rahulenamala/mini-platypus | ["region:us"] | 2024-02-17T07:19:29+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4186564, "num_examples": 1000}], "download_size": 2245921, "dataset_size": 4186564}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T07:19:31+00:00 |
| 62dc34d0908f46f5837df12c3d51da0488cac2a9 | | Yuga0530/mini-platypus | ["region:us"] | 2024-02-17T07:22:46+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4153564, "num_examples": 1000}], "download_size": 2240878, "dataset_size": 4153564}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T07:22:47+00:00 |
67fdf7b49f7c6d3e7a35286ee6c810668bc1bf94 |
# Dataset Card for Evaluation run of MaziyarPanahi/WizardLM-Math-70B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/WizardLM-Math-70B-v0.1](https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1",
"harness_winogrande_5",
split="train")
```
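
To see which configurations and timestamped splits exist before loading anything, you can query the Hub with the `datasets` helpers; a minimal sketch (the printed shapes are indicative, not guaranteed):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1"

# List the evaluation configurations (63 of them, per the summary above).
configs = get_dataset_config_names(repo)
print(len(configs))

# Each configuration has one split per timestamped run, plus "latest".
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```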
## Latest results
These are the [latest results from run 2024-02-18T02:24:17.988962](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1/blob/main/results_2024-02-18T02-24-17.988962.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6914116069568377,
"acc_stderr": 0.03063431437342948,
"acc_norm": 0.6938613221179539,
"acc_norm_stderr": 0.031238741076549784,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5707095526544473,
"mc2_stderr": 0.01525040450448649
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.014090995618168482,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.6746664011153157,
"acc_stderr": 0.0046754187743142306,
"acc_norm": 0.8600876319458275,
"acc_norm_stderr": 0.0034618713240671846
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822502,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822502
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465953,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465953
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.02626502460827588,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.02626502460827588
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683776,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8954128440366973,
"acc_stderr": 0.013120530245265593,
"acc_norm": 0.8954128440366973,
"acc_norm_stderr": 0.013120530245265593
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884565,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.757847533632287,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.757847533632287,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097655,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097655
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924974,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216054,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216054
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5553072625698324,
"acc_stderr": 0.016619881988177012,
"acc_norm": 0.5553072625698324,
"acc_norm_stderr": 0.016619881988177012
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.02240967454730417,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.02240967454730417
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.029736592526424445,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.029736592526424445
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5586701434159062,
"acc_stderr": 0.012682016335646683,
"acc_norm": 0.5586701434159062,
"acc_norm_stderr": 0.012682016335646683
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625162,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625162
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.017242385828779627,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.017242385828779627
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5707095526544473,
"mc2_stderr": 0.01525040450448649
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267207
},
"harness|gsm8k|5": {
"acc": 0.6444275966641395,
"acc_stderr": 0.013185402252713852
}
}
```
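
To work with these aggregated numbers programmatically rather than copying them from the card, one option is to fetch the results file linked above with `huggingface_hub`; a minimal sketch (the repo id and filename come from the link above, but the exact top-level layout of the JSON file is an assumption; recent files nest the per-task dict under a `results` key, so the code falls back gracefully):

```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results file for the latest run (filename from the link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1",
    filename="results_2024-02-18T02-24-17.988962.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)

# Assumption: per-task scores sit under a "results" key; fall back to the top level otherwise.
results = payload.get("results", payload)
print(results["all"]["acc"], results["all"]["acc_norm"])
```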
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
| open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1 | [
"region:us"
] | 2024-02-17T07:29:49+00:00 | {"pretty_name": "Evaluation run of MaziyarPanahi/WizardLM-Math-70B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [MaziyarPanahi/WizardLM-Math-70B-v0.1](https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-18T02:24:17.988962](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-v0.1/blob/main/results_2024-02-18T02-24-17.988962.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6914116069568377,\n \"acc_stderr\": 0.03063431437342948,\n \"acc_norm\": 0.6938613221179539,\n \"acc_norm_stderr\": 0.031238741076549784,\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5707095526544473,\n \"mc2_stderr\": 0.01525040450448649\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.014090995618168482,\n \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635474\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6746664011153157,\n \"acc_stderr\": 0.0046754187743142306,\n \"acc_norm\": 0.8600876319458275,\n \"acc_norm_stderr\": 0.0034618713240671846\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.032166008088022675\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.04113914981189261,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.04113914981189261\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822502,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822502\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n \"acc_norm\": 0.927461139896373,\n 
\"acc_norm_stderr\": 0.018718998520678178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465953,\n \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465953\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.02626502460827588,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.02626502460827588\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683776,\n \"acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8954128440366973,\n \"acc_stderr\": 0.013120530245265593,\n \"acc_norm\": 0.8954128440366973,\n \"acc_norm_stderr\": 0.013120530245265593\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.757847533632287,\n \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.757847533632287,\n \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097655,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097655\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n \"acc_stderr\": 0.019119892798924974,\n \"acc_norm\": 0.905982905982906,\n \"acc_norm_stderr\": 0.019119892798924974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n \"acc_stderr\": 0.012036729568216054,\n \"acc_norm\": 0.8697318007662835,\n \"acc_norm_stderr\": 0.012036729568216054\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5553072625698324,\n \"acc_stderr\": 0.016619881988177012,\n \"acc_norm\": 0.5553072625698324,\n \"acc_norm_stderr\": 0.016619881988177012\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.024739981355113592,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.024739981355113592\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.02240967454730417,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.02240967454730417\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5390070921985816,\n \"acc_stderr\": 0.029736592526424445,\n \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.029736592526424445\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5586701434159062,\n \"acc_stderr\": 0.012682016335646683,\n \"acc_norm\": 0.5586701434159062,\n \"acc_norm_stderr\": 0.012682016335646683\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625162,\n \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625162\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.017242385828779627,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.017242385828779627\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5707095526544473,\n \"mc2_stderr\": 0.01525040450448649\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267207\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6444275966641395,\n \"acc_stderr\": 0.013185402252713852\n }\n}\n```", "repo_url": "https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|arc:challenge|25_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|arc:challenge|25_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|arc:challenge|25_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|gsm8k|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|gsm8k|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|gsm8k|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hellaswag|10_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hellaswag|10_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hellaswag|10_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T07-27-21.521286.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T07-27-21.521286.parquet", 
"**/details_harness|hendrycksTest-virology|5_2024-02-17T07-27-21.521286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T08-53-12.931134.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T08-53-12.931134.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T02-24-17.988962.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T02-24-17.988962.parquet", 
"**/details_harness|hendrycksTest-virology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T02-24-17.988962.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-18T02-24-17.988962.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": 
"2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-18T02-24-17.988962.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T07_27_21.521286", "path": ["**/details_harness|winogrande|5_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["**/details_harness|winogrande|5_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["**/details_harness|winogrande|5_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-18T02-24-17.988962.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T07_27_21.521286", 
"path": ["results_2024-02-17T07-27-21.521286.parquet"]}, {"split": "2024_02_17T08_53_12.931134", "path": ["results_2024-02-17T08-53-12.931134.parquet"]}, {"split": "2024_02_18T02_24_17.988962", "path": ["results_2024-02-18T02-24-17.988962.parquet"]}, {"split": "latest", "path": ["results_2024-02-18T02-24-17.988962.parquet"]}]}]} | 2024-02-17T08:55:38+00:00 |
60afdcd6b7232f991f7f8ffa0be01e6736666863 | PulsarAI/test | [
"region:us"
] | 2024-02-17T07:38:26+00:00 | {} | 2024-02-17T08:44:33+00:00 |
|
e6b3e2b9d716273d20009c77f8c4364d0916702b | Harikrishnan46624/AI_QA_Data | [
"license:apache-2.0",
"region:us"
] | 2024-02-17T07:47:15+00:00 | {"license": "apache-2.0"} | 2024-02-17T14:28:49+00:00 |
|
18420199d1ef41b2391b93d531d54cd3cf9df9e7 | 63days/splat | [
"region:us"
] | 2024-02-17T07:49:32+00:00 | {} | 2024-02-17T08:32:36+00:00 |
|
b1dfb1b7a73f4ad4a1f2d4f36c2152d214d0b072 | Quangnguyen711/Fashion_Shop_Consultant | [
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"finance",
"region:us"
] | 2024-02-17T07:49:33+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"], "tags": ["finance"]} | 2024-02-17T07:51:02+00:00 |
|
e36b50d154bd69e7a484f0c1b2c858dc16276897 | benayas/massive_augmented_20pct_v0 | [
"region:us"
] | 2024-02-17T08:00:08+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "locale", "dtype": "string"}, {"name": "partition", "dtype": "string"}, {"name": "scenario", "dtype": "float64"}, {"name": "intent", "dtype": "float64"}, {"name": "utt", "dtype": "string"}, {"name": "annot_utt", "dtype": "string"}, {"name": "worker_id", "dtype": "string"}, {"name": "slot_method", "struct": [{"name": "method", "sequence": "null"}, {"name": "slot", "sequence": "null"}]}, {"name": "judgments", "struct": [{"name": "grammar_score", "sequence": "int8"}, {"name": "intent_score", "sequence": "int8"}, {"name": "language_identification", "sequence": "null"}, {"name": "slots_score", "sequence": "int8"}, {"name": "spelling_score", "sequence": "int8"}, {"name": "worker_id", "sequence": "null"}]}, {"name": "category", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1763554, "num_examples": 11514}], "download_size": 475194, "dataset_size": 1763554}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T08:00:10+00:00 |
|
ffcb31158f9bd9f72abcc5fd2add91e4f2eaa11a | alisson40889/global | [
"license:openrail",
"region:us"
] | 2024-02-17T08:15:12+00:00 | {"license": "openrail"} | 2024-02-17T08:16:17+00:00 |
|
2258470247657e714d29ce5e3c5bec82ed0c911b | DaviG117/test | [
"region:us"
] | 2024-02-17T08:16:03+00:00 | {} | 2024-02-17T08:29:01+00:00 |
|
8b926400b0641bba3feae66deb5fbee37de80343 | Atipico1/nq-test-adv-replace-v3 | [
"region:us"
] | 2024-02-17T08:19:38+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "entity", "dtype": "string"}, {"name": "similar_entity", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "ctxs", "list": [{"name": "hasanswer", "dtype": "bool"}, {"name": "score", "dtype": "float64"}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}]}, {"name": "masked_query", "dtype": "string"}, {"name": "original_case", "list": [{"name": "answer", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "distance", "dtype": "string"}, {"name": "original_answers", "sequence": "string"}, {"name": "question", "dtype": "string"}]}, {"name": "unans_case", "list": [{"name": "answer", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "context", "dtype": "string"}, {"name": "distance", "dtype": "string"}, {"name": "original_answers", "sequence": "string"}, {"name": "question", "dtype": "string"}]}, {"name": "conflict_case", "list": [{"name": "answer", "dtype": "string"}, {"name": "conflict_context", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "distance", "dtype": "string"}, {"name": "original_answers", "sequence": "string"}, {"name": "question", "dtype": "string"}]}, {"name": "context", "dtype": "string"}, {"name": "context_vague", "dtype": "string"}, {"name": "entities", "dtype": "string"}, {"name": "entities_count", "dtype": "int64"}, {"name": "adv_sent", "dtype": "string"}, {"name": "adv_passage", "dtype": "string"}, {"name": "hasanswer", "dtype": "bool"}, {"name": "is_adversarial", "dtype": "bool"}], "splits": [{"name": "test", "num_bytes": 57386242, "num_examples": 3610}], "download_size": 32792526, "dataset_size": 57386242}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]} | 2024-02-17T08:19:46+00:00 |
|
7975f71b5187817dc0a32ce9af9b7df564c8ecc2 | azlan8289/Book_Genre | [
"license:apache-2.0",
"region:us"
] | 2024-02-17T08:36:42+00:00 | {"license": "apache-2.0"} | 2024-02-17T08:37:58+00:00 |
|
8f2c6956b433a407e60ca10e6998723ebf3d527f | Atipico1/nq-test-valid_adv_passage | [
"region:us"
] | 2024-02-17T08:37:41+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "entity", "dtype": "string"}, {"name": "similar_entity", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "ctxs", "list": [{"name": "hasanswer", "dtype": "bool"}, {"name": "score", "dtype": "float64"}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}]}, {"name": "masked_query", "dtype": "string"}, {"name": "original_case", "list": [{"name": "answer", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "distance", "dtype": "string"}, {"name": "original_answers", "sequence": "string"}, {"name": "question", "dtype": "string"}]}, {"name": "unans_case", "list": [{"name": "answer", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "context", "dtype": "string"}, {"name": "distance", "dtype": "string"}, {"name": "original_answers", "sequence": "string"}, {"name": "question", "dtype": "string"}]}, {"name": "conflict_case", "list": [{"name": "answer", "dtype": "string"}, {"name": "conflict_context", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "distance", "dtype": "string"}, {"name": "original_answers", "sequence": "string"}, {"name": "question", "dtype": "string"}]}, {"name": "context", "dtype": "string"}, {"name": "context_vague", "dtype": "string"}, {"name": "entities", "dtype": "string"}, {"name": "entities_count", "dtype": "int64"}, {"name": "adv_sent", "dtype": "string"}, {"name": "adv_passage", "dtype": "string"}, {"name": "cos_sim", "dtype": "float64"}, {"name": "answer_match", "dtype": "bool"}, {"name": "is_valid_adversary", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 58428413, "num_examples": 3610}], "download_size": 33883766, "dataset_size": 58428413}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T08:37:47+00:00 |
|
4b4d224eeb0d298a520faadaf4253d9f42f75635 | fattahharith/sejarah-stpm | [
"region:us"
] | 2024-02-17T08:39:41+00:00 | {} | 2024-02-17T08:40:43+00:00 |
|
4cc176b64dfb92a5b388dfe177d90f7f8ad5bf9e | JachinL/data2 | [
"region:us"
] | 2024-02-17T08:42:01+00:00 | {} | 2024-02-17T08:43:00+00:00 |
|
2c8c6a2c1ebf6af865f2a9cf77543a1257e783d5 | GGital/Signal_Test07 | [
"region:us"
] | 2024-02-17T08:42:18+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "0", "1": "1", "2": "2", "3": "3", "4": "4", "5": "5", "6": "6"}}}}], "splits": [{"name": "train", "num_bytes": 264257993.609, "num_examples": 3507}], "download_size": 260755282, "dataset_size": 264257993.609}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T08:42:51+00:00 |
|
49d28a3c8829e47a84e666cb114f0e9085438b29 | MesutUnutur/guanaco-llama2-1k | [
"region:us"
] | 2024-02-17T08:44:46+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1654448, "num_examples": 1000}], "download_size": 966692, "dataset_size": 1654448}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T08:44:47+00:00 |
|
30db011f3f68834ded0d35c6fc0fda03d7a714d2 | Bhagya17/Summarization_5.0 | [
"region:us"
] | 2024-02-17T08:46:07+00:00 | {} | 2024-02-17T08:46:28+00:00 |
|
2b56ed8cddbe24cdce776115e6a366b7a224f877 | nizamovtimur/wikiutmn-study-gigachat | [
"license:mit",
"region:us"
] | 2024-02-17T08:51:43+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "document", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 473636, "num_examples": 251}], "download_size": 84353, "dataset_size": 473636}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T09:36:31+00:00 |
|
bd51a77387e0f9637a2d97d211004c9e32f2cff8 | Anonymous0591/C3PSKT | [
"license:mit",
"region:us"
] | 2024-02-17T09:02:26+00:00 | {"license": "mit"} | 2024-02-17T09:02:26+00:00 |
|
4548bcf07b77c84a36fbbeb136ffb4dd4978a2c7 |
# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-relaser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/NeuralTrix-7B-dpo-relaser](https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-relaser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser",
"harness_winogrande_5",
split="train")
```
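If you want the aggregated metrics rather than the per-task details, a minimal sketch loading the "results" configuration instead (the config and split names are taken from the description above):

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of each run;
# the "latest" split always points to the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser",
    "results",
    split="latest",
)
print(results[0])  # one row per run, containing the aggregated metrics
```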
## Latest results
These are the [latest results from run 2024-02-17T09:02:39.812409](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser/blob/main/results_2024-02-17T09-02-39.812409.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6457696459415244,
"acc_stderr": 0.032218510960558784,
"acc_norm": 0.6454366964555287,
"acc_norm_stderr": 0.032889882572498676,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7797569119907752,
"mc2_stderr": 0.01376117718949703
},
"harness|arc:challenge|25": {
"acc": 0.6885665529010239,
"acc_stderr": 0.013532472099850947,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274772
},
"harness|hellaswag|10": {
"acc": 0.6978689504082852,
"acc_stderr": 0.004582433109636472,
"acc_norm": 0.8840868352917746,
"acc_norm_stderr": 0.0031946652660786025
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546954,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546954
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608452,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608452
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092427,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092427
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7797569119907752,
"mc2_stderr": 0.01376117718949703
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.6815769522365428,
"acc_stderr": 0.012832225723075408
}
}
```
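The aggregated metrics shown above are also stored under the `results` configuration of this repo; as a sketch (config and split names are taken from this repo's metadata, so adjust if they change):

```python
from datasets import load_dataset

# "results" is the aggregated-metrics config; the "latest" split always points at the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser",
    "results",
    split="latest",
)
```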
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser | [
"region:us"
] | 2024-02-17T09:05:00+00:00 | {"pretty_name": "Evaluation run of Kquant03/NeuralTrix-7B-dpo-relaser", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/NeuralTrix-7B-dpo-relaser](https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-relaser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T09:02:39.812409](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-relaser/blob/main/results_2024-02-17T09-02-39.812409.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6457696459415244,\n \"acc_stderr\": 0.032218510960558784,\n \"acc_norm\": 0.6454366964555287,\n \"acc_norm_stderr\": 0.032889882572498676,\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7797569119907752,\n \"mc2_stderr\": 0.01376117718949703\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6885665529010239,\n \"acc_stderr\": 0.013532472099850947,\n \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274772\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6978689504082852,\n \"acc_stderr\": 0.004582433109636472,\n \"acc_norm\": 0.8840868352917746,\n \"acc_norm_stderr\": 0.0031946652660786025\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546954,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546954\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608452,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608452\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092427,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092427\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7797569119907752,\n \"mc2_stderr\": 0.01376117718949703\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6815769522365428,\n \"acc_stderr\": 0.012832225723075408\n 
}\n}\n```", "repo_url": "https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-relaser", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|arc:challenge|25_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|gsm8k|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hellaswag|10_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T09-02-39.812409.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T09-02-39.812409.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T09-02-39.812409.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T09-02-39.812409.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T09-02-39.812409.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T09_02_39.812409", "path": ["**/details_harness|winogrande|5_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T09-02-39.812409.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T09_02_39.812409", "path": ["results_2024-02-17T09-02-39.812409.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T09-02-39.812409.parquet"]}]}]} | 2024-02-17T09:05:22+00:00 |
2790f707bb56e84073bd13d330f001a06859ad8d | # Dataset Card for "RefGPT-Fact-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Mutonix/RefGPT-Fact-v2 | [
"region:us"
] | 2024-02-17T09:05:53+00:00 | {"dataset_info": {"features": [{"name": "dialogue", "dtype": "string"}, {"name": "reference", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "type", "dtype": "string"}], "splits": [{"name": "zh", "num_bytes": 605149267, "num_examples": 143154}, {"name": "en", "num_bytes": 4303277735, "num_examples": 522354}], "download_size": 599049240, "dataset_size": 4908427002}} | 2024-02-17T13:04:19+00:00 |
6e005eaca403a715bf0f1561a72a9ebcac3483dc | andersonbcdefg/misc_sts_pairs_v2 | [
"region:us"
] | 2024-02-17T09:08:18+00:00 | {"dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "pos", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}, {"name": "jaccard", "dtype": "float64"}, {"name": "sim", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 1859391440.7831883, "num_examples": 13184276}], "download_size": 1277788105, "dataset_size": 1859391440.7831883}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T09:25:43+00:00 |
|
975a5a58fa42f9fa11249f888d823b156a34b74e | ```python
from diffusers import DiffusionPipeline

# Load the Stable Video Diffusion (image-to-video, XT) pipeline from the Hub.
pipeline = DiffusionPipeline.from_pretrained("stabilityai/stable-video-diffusion-img2vid-xt")
``` | dsplz/dasdasd | [
"region:us"
] | 2024-02-17T09:22:56+00:00 | {} | 2024-02-17T09:23:05+00:00 |
31ba481e65d4d7262f10beeab13ea7a405e5241d | PreciousT/mphs | [
"region:us"
] | 2024-02-17T09:27:59+00:00 | {"dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "Question Text", "dtype": "string"}, {"name": "Question Answer", "dtype": "string"}, {"name": "Reference Document", "dtype": "string"}, {"name": "Paragraph(s) Number", "dtype": "string"}, {"name": "Keywords", "dtype": "string"}, {"name": "instruction", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 437265, "num_examples": 748}], "download_size": 210581, "dataset_size": 437265}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T09:28:00+00:00 |
|
033894b457da975034134d223716d6300ce43468 | grishi118/evm_question | [
"region:us"
] | 2024-02-17T09:37:59+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 743381, "num_examples": 2167}], "download_size": 300157, "dataset_size": 743381}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T09:38:00+00:00 |
|
4646a7d3b5383437f10eb5f2ea590d15b66f72b9 | # Degarbayan-SC: A Colloquial Paraphrase Farsi using pre-trained mT5
This is the dataset of the [Degarbayan-SC paper](https://ieeexplore.ieee.org/abstract/document/9959983).
You can fine-tune transformers models with this dataset using the code on [GitHub](https://github.com/m0javad/Degarbayan-SC).
```python
from datasets import load_dataset
dataset = load_dataset("m0javad/Degarbayan-SC-dataset")
```
### Statistics

Our sentence length distribution is between 3 and 19 words, and sentences average 8 words. This makes sense because movie subtitles are shown within a range of time and we matched them with timespans; humans can only say a certain number of words in a given period of time. Our collected sentences have 128,699 unique words.
As the table above shows, our dataset contains a large number of paraphrase sentences in various forms, such as syntactic, semantic, and conceptual paraphrases.
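As a rough sanity check, these statistics can be recomputed from the loaded dataset. The snippet below is only a sketch: it assumes a `train` split and uses the first column as a placeholder, so inspect `dataset.column_names` for the real schema.

```python
from datasets import load_dataset

dataset = load_dataset("m0javad/Degarbayan-SC-dataset", split="train")  # split name assumed

col = dataset.column_names[0]  # placeholder: pick the actual sentence column
lengths = [len(str(s).split()) for s in dataset[col]]
vocab = {w for s in dataset[col] for w in str(s).split()}

print(f"min/avg/max length: {min(lengths)}/{sum(lengths) / len(lengths):.1f}/{max(lengths)} words")
print(f"unique words: {len(vocab):,}")
```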
### Contact
Contact me for contributions and possible future work at: [email protected] | m0javad/Degarbayan-SC-dataset | [
"task_categories:text-generation",
"task_categories:conversational",
"task_categories:text2text-generation",
"size_categories:100M<n<1B",
"language:fa",
"region:us"
] | 2024-02-17T09:49:03+00:00 | {"language": ["fa"], "size_categories": ["100M<n<1B"], "task_categories": ["text-generation", "conversational", "text2text-generation"]} | 2024-02-17T10:00:27+00:00 |
81aca92e98b367fc6c9ffa9dca38fe453373c942 | lvdthieu/compilable_rate-v2 | [
"region:us"
] | 2024-02-17T09:50:26+00:00 | {} | 2024-02-17T09:56:04+00:00 |
|
4cd2ff54497a83d2d1f24cce672e7d38ba20b850 | # Strix

134k question-answer pairs based on [AiresPucrs'](https://huggingface.co/datasets/AiresPucrs/stanford-encyclopedia-philosophy) [stanford-encyclopedia-philosophy](https://huggingface.co/datasets/AiresPucrs/stanford-encyclopedia-philosophy) dataset. | sayhan/strix-philosophy-qa | [
"task_categories:question-answering",
"language:en",
"license:apache-2.0",
"philosophy",
"region:us"
] | 2024-02-17T09:50:32+00:00 | {"language": ["en"], "license": "apache-2.0", "task_categories": ["question-answering"], "tags": ["philosophy"]} | 2024-02-17T10:34:28+00:00 |
b7608b1f149519dcfb3d84706bcbf09f69878b8c |
This is a dataset that was created from [HuggingFaceH4/OpenHermes-2.5-1k-longest](https://huggingface.co/datasets/HuggingFaceH4/OpenHermes-2.5-1k-longest).
The purpose is to be able to use it in an [axolotl](https://github.com/OpenAccess-AI-Collective/axolotl) config by adding:
```yaml
datasets:
- path: Mihaiii/OpenHermes-2.5-1k-longest-curated
type: alpaca
```
I eliminated rows that:
1) Had a system prompt (only 3 rows eliminated)
2) Contained a character repeated 10 times in a row in the output (478 rows eliminated; see the regex sketch below)
So from a 1,000-row dataset, I ended up with a 519-row dataset.
See the [OpenHermes-2.5-1k-longest-curated.ipynb](https://huggingface.co/datasets/Mihaiii/OpenHermes-2.5-1k-longest-curated/blob/main/OpenHermes-2.5-1k-longest-curated.ipynb) notebook for details on how the dataset was constructed.
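For reference, criterion 2 can be expressed as a single regex over the output column. This is only a sketch (the `output` field name matches this dataset's final schema; the notebook has the exact code that was actually used):

```python
import re

# One character followed by nine more of the same, i.e. 10 identical characters in a row.
REPEATED = re.compile(r"(.)\1{9}")

def keep_row(row):
    # Keep a row only if its output has no 10-in-a-row character run.
    return REPEATED.search(row["output"]) is None

# curated = dataset.filter(keep_row)  # `dataset` being the loaded source split
```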
**Later edit**: after a more in-depth analysis of the dataset, I noticed that:
1) The imported subset is `test_sft`, but this is the 2nd chunk of the top 1k records. The first one is in the `train_sft` subset.
2) Valid code records that contained 10 repeated spaces for indentation were also eliminated.
"region:us"
] | 2024-02-17T09:52:59+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4176433, "num_examples": 519}], "download_size": 1835764, "dataset_size": 4176433}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T12:36:56+00:00 |
089cacdf1b0d94fff165f54d02c0c188f75abd8d | maywell/kiqu_finaltune | [
"region:us"
] | 2024-02-17T09:58:30+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2006869, "num_examples": 997}], "download_size": 1135101, "dataset_size": 2006869}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T09:58:32+00:00 |
|
dab2692ca42a3ec59762114f7719d046296edbb9 |
# Dataset Card for Evaluation run of freeCS-dot-org/OpenAGI-testing-truthyDPO-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freeCS-dot-org/OpenAGI-testing-truthyDPO-1](https://huggingface.co/freeCS-dot-org/OpenAGI-testing-truthyDPO-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-17T10:15:19.081933](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1/blob/main/results_2024-02-17T10-15-19.081933.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6304857303424044,
"acc_stderr": 0.03249871750345685,
"acc_norm": 0.635809758335016,
"acc_norm_stderr": 0.033161164304669005,
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.7112471524136447,
"mc2_stderr": 0.014852535681165156
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.014090995618168478,
"acc_norm": 0.6732081911262798,
"acc_norm_stderr": 0.01370666597558733
},
"harness|hellaswag|10": {
"acc": 0.6648078072097192,
"acc_stderr": 0.004710928569985755,
"acc_norm": 0.8598884684325832,
"acc_norm_stderr": 0.003463933286063884
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887044,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887044
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073406,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073406
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983576,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696647,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.7112471524136447,
"mc2_stderr": 0.014852535681165156
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.3707354056103108,
"acc_stderr": 0.01330426770545843
}
}
```
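To work with the aggregated numbers above programmatically, the same loading pattern applies to the `results` configuration, whose `latest` split points at the most recent run:
```python
from datasets import load_dataset

# Load the aggregated scores for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1",
    "results",
    split="latest",
)
print(results[0])
```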
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1 | [
"region:us"
] | 2024-02-17T10:17:35+00:00 | {"pretty_name": "Evaluation run of freeCS-dot-org/OpenAGI-testing-truthyDPO-1", "dataset_summary": "Dataset automatically created during the evaluation run of model [freeCS-dot-org/OpenAGI-testing-truthyDPO-1](https://huggingface.co/freeCS-dot-org/OpenAGI-testing-truthyDPO-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T10:15:19.081933](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1/blob/main/results_2024-02-17T10-15-19.081933.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6304857303424044,\n \"acc_stderr\": 0.03249871750345685,\n \"acc_norm\": 0.635809758335016,\n \"acc_norm_stderr\": 0.033161164304669005,\n \"mc1\": 0.5361077111383109,\n \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.7112471524136447,\n \"mc2_stderr\": 0.014852535681165156\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.014090995618168478,\n \"acc_norm\": 0.6732081911262798,\n \"acc_norm_stderr\": 0.01370666597558733\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6648078072097192,\n \"acc_stderr\": 0.004710928569985755,\n \"acc_norm\": 0.8598884684325832,\n \"acc_norm_stderr\": 0.003463933286063884\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887044,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887044\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073406,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073406\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n 
\"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.45045632333767927,\n \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983576,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983576\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5361077111383109,\n \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.7112471524136447,\n \"mc2_stderr\": 0.014852535681165156\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3707354056103108,\n \"acc_stderr\": 0.01330426770545843\n }\n}\n```", "repo_url": 
"https://huggingface.co/freeCS-dot-org/OpenAGI-testing-truthyDPO-1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|arc:challenge|25_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|gsm8k|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hellaswag|10_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-15-19.081933.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-15-19.081933.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-15-19.081933.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T10-15-19.081933.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-15-19.081933.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T10_15_19.081933", "path": ["**/details_harness|winogrande|5_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T10-15-19.081933.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T10_15_19.081933", "path": ["results_2024-02-17T10-15-19.081933.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T10-15-19.081933.parquet"]}]}]} | 2024-02-17T10:17:55+00:00 |
e620e4336aff109feb588cc3b3c2a41d8617d196 | # What Is This
This is the raw data of the datasets used to train the models in [RVC-Models](https://huggingface.co/kuwacom/RVC-Models).
When using it, please convert the data into a format suited to your training environment.
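As a minimal sketch of such a conversion, assuming the raw data is audio and your training setup expects mono WAV at a fixed sample rate (the 40 kHz target and the file paths below are assumptions; match your own environment):

```python
import librosa
import soundfile as sf

# Hypothetical paths; resample to whatever rate your training pipeline expects.
y, sr = librosa.load("raw/sample.wav", sr=40000, mono=True)  # load as mono, resampled
sf.write("converted/sample.wav", y, sr)                      # write back as WAV
```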
# About Dataset
## yukkuri
Videos of Japanese sentences generated with ChatGPT, created in Yukkuri Movie Maker 4 and totaling about 100 minutes. | kuwacom/Character-Dataset | [
"license:mit",
"region:us"
] | 2024-02-17T10:25:03+00:00 | {"license": "mit", "pretty_name": "d"} | 2024-02-17T13:14:04+00:00 |
977c295328baa4ac59b4409a3c43771d65b219e2 |
# Dataset Card for Evaluation run of freeCS-dot-org/OpenAGI-testing-intelDPO-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freeCS-dot-org/OpenAGI-testing-intelDPO-2](https://huggingface.co/freeCS-dot-org/OpenAGI-testing-intelDPO-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2",
"harness_winogrande_5",
split="train")
```
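The same call also works for the aggregated scores; a minimal sketch using the "results" configuration and the "latest" split (both are defined in this repo's configuration list):

```python
from datasets import load_dataset

# "latest" always resolves to the most recent evaluation run; the timestamped
# split name (e.g. "2024_02_17T10_25_04.402687") pins a specific run instead.
results = load_dataset("open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2",
	"results",
	split="latest")
```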
## Latest results
These are the [latest results from run 2024-02-17T10:25:04.402687](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2/blob/main/results_2024-02-17T10-25-04.402687.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6269237223761134,
"acc_stderr": 0.03285871940715557,
"acc_norm": 0.6302370765791581,
"acc_norm_stderr": 0.03352050314315757,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.0171938358120939,
"mc2": 0.5828319150839333,
"mc2_stderr": 0.015305820815113032
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.01437035863247244,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844463
},
"harness|hellaswag|10": {
"acc": 0.6442939653455487,
"acc_stderr": 0.004777483159634023,
"acc_norm": 0.8463453495319657,
"acc_norm_stderr": 0.00359880385546063
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.02582210611941589,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.02582210611941589
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.024784316942156395,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.024784316942156395
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203624,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069723,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069723
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914742,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599924,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.0171938358120939,
"mc2": 0.5828319150839333,
"mc2_stderr": 0.015305820815113032
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223178
},
"harness|gsm8k|5": {
"acc": 0.5094768764215315,
"acc_stderr": 0.01377001065116882
}
}
```
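As a minimal sketch, the headline numbers above can also be read back out of the raw results file, assuming it has been downloaded locally and its top-level structure matches the excerpt:

```python
import json

# Path is an assumption -- point it at the downloaded results file.
with open("results_2024-02-17T10-25-04.402687.json") as f:
    results = json.load(f)

print(results["all"]["acc"])              # mean accuracy across tasks
print(results["harness|gsm8k|5"]["acc"])  # GSM8K 5-shot accuracy
```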
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2 | [
"region:us"
] | 2024-02-17T10:27:23+00:00 | {"pretty_name": "Evaluation run of freeCS-dot-org/OpenAGI-testing-intelDPO-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [freeCS-dot-org/OpenAGI-testing-intelDPO-2](https://huggingface.co/freeCS-dot-org/OpenAGI-testing-intelDPO-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T10:25:04.402687](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2/blob/main/results_2024-02-17T10-25-04.402687.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6269237223761134,\n \"acc_stderr\": 0.03285871940715557,\n \"acc_norm\": 0.6302370765791581,\n \"acc_norm_stderr\": 0.03352050314315757,\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5828319150839333,\n \"mc2_stderr\": 0.015305820815113032\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.01437035863247244,\n \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844463\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6442939653455487,\n \"acc_stderr\": 0.004777483159634023,\n \"acc_norm\": 0.8463453495319657,\n \"acc_norm_stderr\": 0.00359880385546063\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n \"acc_stderr\": 0.02582210611941589,\n \"acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.02582210611941589\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n 
\"acc_norm_stderr\": 0.026148483469153314\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.024784316942156395,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.024784316942156395\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203624,\n \"acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203624\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.02636165166838909,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.02636165166838909\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294407,\n \"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294407\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069723,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069723\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n \"acc_stderr\": 0.016435865260914742,\n \"acc_norm\": 0.40782122905027934,\n \"acc_norm_stderr\": 0.016435865260914742\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599924,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599924\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5828319150839333,\n \"mc2_stderr\": 0.015305820815113032\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223178\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5094768764215315,\n \"acc_stderr\": 
0.01377001065116882\n }\n}\n```", "repo_url": "https://huggingface.co/freeCS-dot-org/OpenAGI-testing-intelDPO-2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|arc:challenge|25_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|gsm8k|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hellaswag|10_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-25-04.402687.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-25-04.402687.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-25-04.402687.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T10-25-04.402687.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-25-04.402687.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T10_25_04.402687", "path": ["**/details_harness|winogrande|5_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T10-25-04.402687.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T10_25_04.402687", "path": ["results_2024-02-17T10-25-04.402687.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T10-25-04.402687.parquet"]}]}]} | 2024-02-17T10:27:44+00:00 |
eb2c7438edbabac6a9b7a3ab9464de4f30b66d79 | Antonio88/Talistran-Dataset | [
"license:apache-2.0",
"region:us"
] | 2024-02-17T10:37:35+00:00 | {"license": "apache-2.0"} | 2024-02-17T12:52:16+00:00 |
|
e1b378589c85e0ef8cb85f21bc7f120e92dce08f | benayas/massive_augmented_5pct_v1 | [
"region:us"
] | 2024-02-17T10:38:00+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "locale", "dtype": "string"}, {"name": "partition", "dtype": "string"}, {"name": "scenario", "dtype": "float64"}, {"name": "intent", "dtype": "float64"}, {"name": "utt", "dtype": "string"}, {"name": "annot_utt", "dtype": "string"}, {"name": "worker_id", "dtype": "string"}, {"name": "slot_method", "struct": [{"name": "method", "sequence": "null"}, {"name": "slot", "sequence": "null"}]}, {"name": "judgments", "struct": [{"name": "grammar_score", "sequence": "int8"}, {"name": "intent_score", "sequence": "int8"}, {"name": "language_identification", "sequence": "null"}, {"name": "slots_score", "sequence": "int8"}, {"name": "spelling_score", "sequence": "int8"}, {"name": "worker_id", "sequence": "null"}]}, {"name": "category", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1660855, "num_examples": 11514}], "download_size": 398336, "dataset_size": 1660855}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T10:38:02+00:00 |
|
d5388dfbdeb29953d5cd10e735755e28d39cad94 | Dev2410/CR2 | [
"license:apache-2.0",
"region:us"
] | 2024-02-17T10:48:01+00:00 | {"license": "apache-2.0"} | 2024-02-17T11:06:06+00:00 |
|
b55c5525bb0b7b0634af996e0137a3b76ef69550 | Atipico1/nq-test-valid-adversary-replace | [
"region:us"
] | 2024-02-17T10:49:19+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "entity", "dtype": "string"}, {"name": "similar_entity", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "ctxs", "list": [{"name": "hasanswer", "dtype": "bool"}, {"name": "score", "dtype": "float64"}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}]}, {"name": "masked_query", "dtype": "string"}, {"name": "original_case", "list": [{"name": "answer", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "distance", "dtype": "string"}, {"name": "original_answers", "sequence": "string"}, {"name": "question", "dtype": "string"}]}, {"name": "unans_case", "list": [{"name": "answer", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "context", "dtype": "string"}, {"name": "distance", "dtype": "string"}, {"name": "original_answers", "sequence": "string"}, {"name": "question", "dtype": "string"}]}, {"name": "conflict_case", "list": [{"name": "answer", "dtype": "string"}, {"name": "conflict_context", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "distance", "dtype": "string"}, {"name": "original_answers", "sequence": "string"}, {"name": "question", "dtype": "string"}]}, {"name": "context", "dtype": "string"}, {"name": "context_vague", "dtype": "string"}, {"name": "entities", "dtype": "string"}, {"name": "entities_count", "dtype": "int64"}, {"name": "adv_sent", "dtype": "string"}, {"name": "adv_passage", "dtype": "string"}, {"name": "cos_sim", "dtype": "float64"}, {"name": "answer_match", "dtype": "bool"}, {"name": "is_valid_adversary", "dtype": "bool"}, {"name": "hasanswer", "dtype": "bool"}, {"name": "is_adversarial", "dtype": "bool"}], "splits": [{"name": "test", "num_bytes": 58345319, "num_examples": 3610}], "download_size": 34093569, "dataset_size": 58345319}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]} | 2024-02-17T10:49:26+00:00 |
|
3ea4553075f4ce21a0cc0baab3ef0d0cabad2e93 | maghwa/OpenHermes-2-AR-10K-28-700k-710k | [
"region:us"
] | 2024-02-17T10:51:22+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "hash", "dtype": "null"}, {"name": "category", "dtype": "null"}, {"name": "system_prompt", "dtype": "null"}, {"name": "model_name", "dtype": "null"}, {"name": "language", "dtype": "null"}, {"name": "views", "dtype": "float64"}, {"name": "conversations", "dtype": "string"}, {"name": "topic", "dtype": "null"}, {"name": "id", "dtype": "null"}, {"name": "avatarUrl", "dtype": "null"}, {"name": "custom_instruction", "dtype": "null"}, {"name": "skip_prompt_formatting", "dtype": "null"}, {"name": "idx", "dtype": "null"}, {"name": "title", "dtype": "null"}, {"name": "model", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 25266234, "num_examples": 10001}], "download_size": 11486043, "dataset_size": 25266234}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T10:51:24+00:00 |
|
33b71c67632904c145c65c85ccf996d84a132507 | vaidehikale24/sample_data | [
"region:us"
] | 2024-02-17T10:54:36+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20697.1875, "num_examples": 33}, {"name": "test", "num_bytes": 9407.8125, "num_examples": 15}], "download_size": 41615, "dataset_size": 30105.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-17T11:09:33+00:00 |
|
dac2ef99ee0eba53a75e77c68d80eb1b6b8f1c91 | ptd150101/infore | [
"region:us"
] | 2024-02-17T10:58:45+00:00 | {} | 2024-02-17T11:02:43+00:00 |
|
ff3eba76d4a124c860e6c161ddbd9c5454ee7227 | Kyudan/MathAccess_GTNT311K | [
"region:us"
] | 2024-02-17T11:04:32+00:00 | {} | 2024-02-17T11:59:16+00:00 |
|
154a2f22f74a9b6023a6f4687218b3f39cb0e741 | CesarChaMal/my-personal-model | [
"region:us"
] | 2024-02-17T11:11:53+00:00 | {"dataset_info": {"features": [{"name": "train", "struct": [{"name": "text", "sequence": "string"}]}, {"name": "test", "struct": [{"name": "text", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 45371, "num_examples": 1}], "download_size": 37247, "dataset_size": 45371}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T11:11:58+00:00 |
|
1d829104879065d0d0f2fd6521b827ca8767b063 | kalabedo/meum | [
"region:us"
] | 2024-02-17T11:15:57+00:00 | {} | 2024-02-17T11:30:14+00:00 |
|
d5e1a9b9d28c3df70bbd0ff3f5dfbd5c7e0b83d5 | chronopt-research/vggface2 | [
"region:us"
] | 2024-02-17T11:17:32+00:00 | {} | 2024-02-17T17:40:05+00:00 |
|
b7630b7df89149147dddbd0fa0b3775fb3622de2 | shajiu/ParallelCorpusSFT | [
"license:apache-2.0",
"region:us"
] | 2024-02-17T11:27:36+00:00 | {"license": "apache-2.0"} | 2024-02-17T11:37:10+00:00 |
|
2b8fd0d01eb9cb26d06ee4c3083c2707df9526ca | bertram-gilfoyle/CC-MAIN-2023-06 | [
"region:us"
] | 2024-02-17T11:44:26+00:00 | {} | 2024-02-17T15:33:13+00:00 |
|
1c9ea2a655daaa0894c5839114b61a8557c84e8d | cmcmaster/rheumatology-vignettes-prefs | [
"region:us"
] | 2024-02-17T11:46:36+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "generation_model", "sequence": "string"}, {"name": "generation_prompt", "list": {"list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}}, {"name": "raw_generation_responses", "sequence": "string"}, {"name": "generations", "sequence": "string"}, {"name": "labelling_model", "dtype": "string"}, {"name": "labelling_prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "raw_labelling_response", "dtype": "string"}, {"name": "rating", "sequence": "float64"}, {"name": "rationale", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 3916648, "num_examples": 165}], "download_size": 1463137, "dataset_size": 3916648}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T11:46:42+00:00 |
|
9a25594de460d8e2daaaa950915ff55e1d654e2c | AsphyXIA/Baarat-Kan-QA | [
"region:us"
] | 2024-02-17T12:02:31+00:00 | {"dataset_info": {"features": [{"name": "answer", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 69642578, "num_examples": 99544}], "download_size": 26721665, "dataset_size": 69642578}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T12:02:33+00:00 |
|
f517da1233a70b6127468ede8f1d000adbd58206 |
# phrase-ticker Dataset
## Description
The Phrase Ticker Dataset enables the extraction of stock ticker symbols from natural language queries. The dataset pairs natural language utterances commonly associated with S&P 500 companies with their corresponding ticker symbols, providing a simple resource for understanding how companies are referred to in various contexts.
## Structure
The dataset comprises two columns:
- `phrase`: This column contains natural language phrases that reference or describe companies in ways that are commonly used in financial news, reports, and discussions. These include not only formal company names and products but also informal and colloquial references.
- `ticker`: Each phrase is associated with a unique stock ticker symbol, identifying the company mentioned or described in the phrase.
## Primary Use Case
**Ticker Extraction from Natural Language Queries**: The main application of this dataset is to train models that can accurately identify and extract stock ticker symbols from text. This capability is crucial for automating the analysis of financial news, social media mentions, analyst reports, and any textual content where companies are discussed without directly mentioning their ticker symbols.
## Getting Started
To begin working with the phrase-ticker Dataset in your projects, you can load it using the Hugging Face `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("rohanmahen/phrase-ticker")
```
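Once loaded, the two columns can be used directly for simple lookups. The sketch below is a minimal, illustrative example — it assumes the data ships in a split named `train` with the `phrase` and `ticker` columns described above, and that an incoming query matches a stored phrase exactly (case-insensitively); a real extractor would likely add fuzzy or learned matching on top:
```python
from datasets import load_dataset

# Assumes a "train" split exposing the "phrase" and "ticker" columns
dataset = load_dataset("rohanmahen/phrase-ticker", split="train")

# Build a naive case-insensitive phrase -> ticker lookup table
phrase_to_ticker = {row["phrase"].lower(): row["ticker"] for row in dataset}

query = "the iphone maker"  # hypothetical query, not necessarily in the data
print(phrase_to_ticker.get(query.lower(), "no ticker found"))
```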
## Contributions
Contributions to the phrase-ticker Dataset are welcome, including the addition of new phrases, refinement of existing data, and suggestions for improvement. Please check out the repository on [GitHub](https://github.com/rohanmahen/phrase-ticker) for more information.
| rohanmahen/phrase-ticker | [
"license:mit",
"region:us"
] | 2024-02-17T12:04:15+00:00 | {"license": "mit"} | 2024-02-17T12:55:53+00:00 |
103a7209e484f5b8c31c4584fa48736a8f8e1eba | maghwa/OpenHermes-2-AR-10K-29-710k-720k | [
"region:us"
] | 2024-02-17T12:04:28+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "hash", "dtype": "null"}, {"name": "category", "dtype": "null"}, {"name": "system_prompt", "dtype": "null"}, {"name": "model_name", "dtype": "null"}, {"name": "language", "dtype": "null"}, {"name": "views", "dtype": "float64"}, {"name": "conversations", "dtype": "string"}, {"name": "topic", "dtype": "null"}, {"name": "id", "dtype": "null"}, {"name": "avatarUrl", "dtype": "null"}, {"name": "custom_instruction", "dtype": "null"}, {"name": "skip_prompt_formatting", "dtype": "null"}, {"name": "idx", "dtype": "null"}, {"name": "title", "dtype": "null"}, {"name": "model", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 25440345, "num_examples": 10001}], "download_size": 11528460, "dataset_size": 25440345}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T12:04:29+00:00 |
|
95c7b548d60917dc408a8569558da290424165ac | kyujinpy/KoCommercial-Dataset | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-02-17T12:11:56+00:00 | {"license": "cc-by-nc-sa-4.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9585708419, "num_examples": 1826703}], "download_size": 4024941178, "dataset_size": 9585708419}} | 2024-02-17T17:11:11+00:00 |
|
2a3b3406e9ef2fe616c0cd1ccf1d47193f0fbeb4 |
 | PotatoOff/Milo | [
"task_categories:text-generation",
"task_categories:question-answering",
"language:en",
"region:us"
] | 2024-02-17T12:15:22+00:00 | {"language": ["en"], "task_categories": ["text-generation", "question-answering"]} | 2024-02-17T12:15:56+00:00 |
c6fcab91636b451af464d9b46f7dfedd24224139 | 0x7o/RussianVibe-data | [
"region:us"
] | 2024-02-17T12:29:32+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "caption", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3735780097.615, "num_examples": 3497}], "download_size": 4135366884, "dataset_size": 3735780097.615}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T12:32:59+00:00 |
|
75ca409bcce2e97ab67213b7c6143788fae071c0 | kkknikodem1/oskar | [
"region:us"
] | 2024-02-17T12:48:01+00:00 | {} | 2024-02-17T12:48:01+00:00 |
|
0d1c3b840c0ef8696f6f2fe7d4410279054fae0e | goodbee1/ttttttt | [
"region:us"
] | 2024-02-17T12:51:33+00:00 | {} | 2024-02-17T12:51:33+00:00 |
|
9586281859956244356969a0679230de7a3da68f | lab42/cov-json-vqa | [
"region:us"
] | 2024-02-17T12:55:58+00:00 | {"dataset_info": {"features": [{"name": "image_0", "dtype": "image"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "images_rest", "sequence": "image"}, {"name": "mask_0", "dtype": "image"}, {"name": "mask_1", "dtype": "image"}, {"name": "mask_2", "dtype": "image"}, {"name": "masks_rest", "sequence": "image"}, {"name": "conversations", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "dataset", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "n_images", "dtype": "int32"}, {"name": "n_masks", "dtype": "int32"}, {"name": "n_conversations", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 45674133.5, "num_examples": 25210}, {"name": "validation", "num_bytes": 4944744.25, "num_examples": 2729}], "download_size": 9198559, "dataset_size": 50618877.75}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-17T12:56:07+00:00 |
|
16a487910a8f1e3380051b8010e6affec9b27420 | holikopii/holikopii | [
"region:us"
] | 2024-02-17T13:02:11+00:00 | {} | 2024-02-17T13:02:11+00:00 |
|
09e0ccbbdd7909573be28f8131fd037e9a3f11e4 |
# Dataset Card for Evaluation run of cloudyu/Mixtral_13B_Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_13B_Chat](https://huggingface.co/cloudyu/Mixtral_13B_Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat",
"harness_winogrande_5",
split="train")
```
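Since the card lists 63 per-task configurations, it can be handy to enumerate them instead of typing names by hand. A short sketch using the `datasets` library's `get_dataset_config_names` helper (the repo id is taken from this card; the printed counts are illustrative):
```python
from datasets import get_dataset_config_names

# Enumerate all evaluation configurations hosted in this details repo
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat"
)
print(len(configs))  # expected: 63 per this card
print(configs[:5])   # peek at the first few config names
```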
## Latest results
These are the [latest results from run 2024-02-17T13:01:58.551979](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat/blob/main/results_2024-02-17T13-01-58.551979.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6471647618048562,
"acc_stderr": 0.03218980683733778,
"acc_norm": 0.6495327471727932,
"acc_norm_stderr": 0.032835191770398446,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5897994402086952,
"mc2_stderr": 0.015625316517181305
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.01399057113791876,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693247
},
"harness|hellaswag|10": {
"acc": 0.6725751842262497,
"acc_stderr": 0.004683146373232271,
"acc_norm": 0.8586934873531169,
"acc_norm_stderr": 0.0034762555096445303
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829193,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5897994402086952,
"mc2_stderr": 0.015625316517181305
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.011151145042218324
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064685
}
}
```
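To pull these aggregated numbers programmatically rather than reading the JSON above, the snippet below is a minimal sketch: it assumes the "results" configuration and the "latest" split listed in this card's metadata, and simply prints the stored record for the most recent run:
```python
from datasets import load_dataset

# Assumes a "results" config with a "latest" split, as listed in this card's metadata
results = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat",
    "results",
    split="latest",
)

# Inspect the first stored record of aggregated results
print(results[0])
```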
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat | [
"region:us"
] | 2024-02-17T13:04:13+00:00 | {"pretty_name": "Evaluation run of cloudyu/Mixtral_13B_Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_13B_Chat](https://huggingface.co/cloudyu/Mixtral_13B_Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T13:01:58.551979](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat/blob/main/results_2024-02-17T13-01-58.551979.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6471647618048562,\n \"acc_stderr\": 0.03218980683733778,\n \"acc_norm\": 0.6495327471727932,\n \"acc_norm_stderr\": 0.032835191770398446,\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5897994402086952,\n \"mc2_stderr\": 0.015625316517181305\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.01399057113791876,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693247\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6725751842262497,\n \"acc_stderr\": 0.004683146373232271,\n \"acc_norm\": 0.8586934873531169,\n \"acc_norm_stderr\": 0.0034762555096445303\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 
0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829193,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829193\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n 
\"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n \"acc_stderr\": 0.012738547371303956,\n \"acc_norm\": 0.46479791395045633,\n \"acc_norm_stderr\": 0.012738547371303956\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5897994402086952,\n \"mc2_stderr\": 0.015625316517181305\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \"acc_stderr\": 0.013650728047064685\n }\n}\n```", "repo_url": "https://huggingface.co/cloudyu/Mixtral_13B_Chat", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|arc:challenge|25_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|gsm8k|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hellaswag|10_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["**/details_harness|winogrande|5_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T13-01-58.551979.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T13_01_58.551979", "path": ["results_2024-02-17T13-01-58.551979.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T13-01-58.551979.parquet"]}]}]} | 2024-02-17T13:04:34+00:00 |
55ded5c57fb947052f9f5afa3327b57a59433b18 | benayas/massive_augmented_10pct_v1 | [
"region:us"
] | 2024-02-17T13:07:56+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "locale", "dtype": "string"}, {"name": "partition", "dtype": "string"}, {"name": "scenario", "dtype": "float64"}, {"name": "intent", "dtype": "float64"}, {"name": "utt", "dtype": "string"}, {"name": "annot_utt", "dtype": "string"}, {"name": "worker_id", "dtype": "string"}, {"name": "slot_method", "struct": [{"name": "method", "sequence": "null"}, {"name": "slot", "sequence": "null"}]}, {"name": "judgments", "struct": [{"name": "grammar_score", "sequence": "int8"}, {"name": "intent_score", "sequence": "int8"}, {"name": "language_identification", "sequence": "null"}, {"name": "slots_score", "sequence": "int8"}, {"name": "spelling_score", "sequence": "int8"}, {"name": "worker_id", "sequence": "null"}]}, {"name": "category", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1691763, "num_examples": 11514}], "download_size": 428870, "dataset_size": 1691763}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T13:07:58+00:00 |
|
9ba7d4a4ee7294d33d16aba3387436bf617cb996 | itsvivek/imagenet-1k | [
"region:us"
] | 2024-02-17T13:09:13+00:00 | {} | 2024-02-17T13:09:13+00:00 |
|
e687705166667d226aee58fb783f7587b971e49c | maghwa/OpenHermes-2-AR-10K-30-720k-730k | [
"region:us"
] | 2024-02-17T13:10:13+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "hash", "dtype": "null"}, {"name": "category", "dtype": "null"}, {"name": "system_prompt", "dtype": "null"}, {"name": "model_name", "dtype": "null"}, {"name": "language", "dtype": "null"}, {"name": "views", "dtype": "float64"}, {"name": "conversations", "dtype": "string"}, {"name": "topic", "dtype": "null"}, {"name": "id", "dtype": "null"}, {"name": "avatarUrl", "dtype": "null"}, {"name": "custom_instruction", "dtype": "null"}, {"name": "skip_prompt_formatting", "dtype": "null"}, {"name": "idx", "dtype": "null"}, {"name": "title", "dtype": "null"}, {"name": "model", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 25337285, "num_examples": 10001}], "download_size": 11474223, "dataset_size": 25337285}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T13:10:14+00:00 |
|
4673330aa463021368d2df5be17c29c8b4f2431d | ddahlmeier/sutd_instruct | [
"region:us"
] | 2024-02-17T13:24:51+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 99626, "num_examples": 30}], "download_size": 45451, "dataset_size": 99626}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T15:19:01+00:00 |
|
974623f6718e1fdbd75f7c83b2f858cf85b65d5e | # Dataset Card for "RefGPT-Reason"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Mutonix/RefGPT-Reason | [
"region:us"
] | 2024-02-17T13:31:17+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "best_answer", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 351127497, "num_examples": 288285}], "download_size": 210897840, "dataset_size": 351127497}} | 2024-02-17T13:32:56+00:00 |
bc390756b7c49e353bc1f4c12a75fb10a7261eeb | hbx/IN3 | [
"license:apache-2.0",
"region:us"
] | 2024-02-17T13:31:22+00:00 | {"license": "apache-2.0"} | 2024-02-17T13:47:55+00:00 |
|
139fa20f189383bb5978a736274cf17ea7cc864f | Kalfrin/camus | [
"license:openrail",
"region:us"
] | 2024-02-17T13:38:16+00:00 | {"license": "openrail"} | 2024-02-17T13:38:53+00:00 |
|
7c73af7e30499b0ff3eefbf8b88eca92fd1a2960 | # Dataset Card for "vikwiki-qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | boapps/vikwiki-qa | [
"region:us"
] | 2024-02-17T13:44:14+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "correct_answer", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1267180.5238698774, "num_examples": 5325}, {"name": "test", "num_bytes": 422631.4761301225, "num_examples": 1776}], "download_size": 983857, "dataset_size": 1689812.0}} | 2024-02-17T13:44:20+00:00 |
bfaf34347c4f74efe93807194962edc9ed4c5156 |
# Dataset of conte_di_cavour/コンテ·ディ·カブール/加富尔伯爵 (Azur Lane)
This is the dataset of conte_di_cavour/コンテ·ディ·カブール/加富尔伯爵 (Azur Lane), containing 9 images and their tags.
The core tags of this character are `grey_hair, yellow_eyes, short_hair, bangs, beret, hat, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 9 | 6.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 9        | 5.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 13 | 7.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 9        | 6.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 13 | 8.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/conte_di_cavour_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
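If you prefer the pre-processed IMG+TXT packages over the raw archive, a minimal sketch of reading one is shown below. It assumes the usual IMG+TXT layout, where each image in the archive is paired with a same-named `.txt` file holding its tags; if your package is organized differently, adjust the filename check accordingly.

```python
import zipfile

from huggingface_hub import hf_hub_download

# download one of the pre-processed IMG+TXT packages (the 800px one here)
zip_file = hf_hub_download(
    repo_id='CyberHarem/conte_di_cavour_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# assumption: every image is paired with a same-named .txt tag file
with zipfile.ZipFile(zip_file, 'r') as zf:
    for name in zf.namelist():
        if name.endswith('.txt'):
            tags = zf.read(name).decode('utf-8')
            print(name, '->', tags)
```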
## List of Clusters
List of tag clustering results; some distinct outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, solo, weapon, black_thighhighs, closed_mouth, holding, looking_at_viewer, white_gloves, cape, full_body, simple_background, water, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | weapon | black_thighhighs | closed_mouth | holding | looking_at_viewer | white_gloves | cape | full_body | simple_background | water | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:-------------------|:---------------|:----------|:--------------------|:---------------|:-------|:------------|:--------------------|:--------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/conte_di_cavour_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T13:50:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T14:04:47+00:00 |
f9ea974443459643a40e834cb3d75c22ba3848f5 | Diiiann/ossetian_lang | [
"region:us"
] | 2024-02-17T14:00:13+00:00 | {} | 2024-02-17T14:00:13+00:00 |
|
76c5d7ce9a21002bf989a3a3a0802f395c361f4f | Diiiann/ossetian-russian | [
"region:us"
] | 2024-02-17T14:00:59+00:00 | {"dataset_info": {"features": [{"name": "oss", "dtype": "string"}, {"name": "ru", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 373189, "num_examples": 141}], "download_size": 189545, "dataset_size": 373189}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T14:01:00+00:00 |
|
4db4eedcf9ae6fc375c9199c93fe38a73b9481d2 | # Dataset Card for "ytrends5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | firstgradeai/ytrends5 | [
"region:us"
] | 2024-02-17T14:02:11+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13395866.378974993, "num_examples": 9069}, {"name": "test", "num_bytes": 5741507.621025008, "num_examples": 3887}], "download_size": 9624751, "dataset_size": 19137374.0}} | 2024-02-17T14:05:09+00:00 |
a3437118039be6e62a2605ce93900fb29ea6d4cd | lab42/cov-json-vqa-10-v3 | [
"region:us"
] | 2024-02-17T14:02:48+00:00 | {"dataset_info": {"features": [{"name": "image_0", "dtype": "image"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "images_rest", "sequence": "image"}, {"name": "mask_0", "dtype": "image"}, {"name": "mask_1", "dtype": "image"}, {"name": "mask_2", "dtype": "image"}, {"name": "masks_rest", "sequence": "image"}, {"name": "conversations", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "dataset", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "n_images", "dtype": "int32"}, {"name": "n_masks", "dtype": "int32"}, {"name": "n_conversations", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 16780.0, "num_examples": 9}, {"name": "validation", "num_bytes": 1876.0, "num_examples": 1}], "download_size": 43039, "dataset_size": 18656.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-17T14:02:54+00:00 |
|
fd2fb64e5850c62a0a1ce26acc8fa0c263bee9df | looper525/mini-platypus | [
"region:us"
] | 2024-02-17T14:06:00+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "data_source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 30776452, "num_examples": 24926}], "download_size": 15552844, "dataset_size": 30776452}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T14:06:02+00:00 |
|
e885bc2e0ef4d10274628f38850d8dc0f65952a8 | dakkulanthu/news | [
"region:us"
] | 2024-02-17T14:08:30+00:00 | {} | 2024-02-17T14:21:27+00:00 |
|
e012e15427ab8b4cc996bb99228a89a133c3e79a | lab42/cov-json-vqa-v3 | [
"region:us"
] | 2024-02-17T14:26:21+00:00 | {"dataset_info": {"features": [{"name": "image_0", "dtype": "image"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "images_rest", "sequence": "image"}, {"name": "mask_0", "dtype": "image"}, {"name": "mask_1", "dtype": "image"}, {"name": "mask_2", "dtype": "image"}, {"name": "masks_rest", "sequence": "image"}, {"name": "conversations", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "dataset", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "n_images", "dtype": "int32"}, {"name": "n_masks", "dtype": "int32"}, {"name": "n_conversations", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 55013986.5, "num_examples": 25210}, {"name": "validation", "num_bytes": 5952357.25, "num_examples": 2729}], "download_size": 19882115, "dataset_size": 60966343.75}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-17T14:26:29+00:00 |
|
d1bfb74f3331f6fa46c48e519ee07b2e5be289f5 | maghwa/OpenHermes-2-AR-10K-31-730k-740k | [
"region:us"
] | 2024-02-17T14:28:08+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "hash", "dtype": "null"}, {"name": "category", "dtype": "null"}, {"name": "system_prompt", "dtype": "null"}, {"name": "model_name", "dtype": "null"}, {"name": "language", "dtype": "null"}, {"name": "views", "dtype": "float64"}, {"name": "conversations", "dtype": "string"}, {"name": "topic", "dtype": "null"}, {"name": "id", "dtype": "null"}, {"name": "avatarUrl", "dtype": "null"}, {"name": "custom_instruction", "dtype": "null"}, {"name": "skip_prompt_formatting", "dtype": "null"}, {"name": "idx", "dtype": "null"}, {"name": "title", "dtype": "null"}, {"name": "model", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 25172981, "num_examples": 10001}], "download_size": 11410894, "dataset_size": 25172981}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T14:28:09+00:00 |
|
ce6e3ed267388f8725bff3b07a7bb18ec475276f |
# Dataset Card for Evaluation run of DatPySci/pythia-1b-sft-50k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-sft-50k](https://huggingface.co/DatPySci/pythia-1b-sft-50k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k",
"harness_winogrande_5",
split="train")
```
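
To see which of the 63 configurations are available before picking one, you can enumerate them with the standard `datasets` helper:

```python
from datasets import get_dataset_config_names

# list every per-task configuration exposed by this details repo
configs = get_dataset_config_names("open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k")
print(len(configs), "configs available")
print(configs[:5])
```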
## Latest results
These are the [latest results from run 2024-02-17T14:41:59.887810](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k/blob/main/results_2024-02-17T14-41-59.887810.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24467360878008643,
"acc_stderr": 0.03024534477539282,
"acc_norm": 0.24555697815658845,
"acc_norm_stderr": 0.030978837434188194,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.01450904517148729,
"mc2": 0.37005968856579075,
"mc2_stderr": 0.014337009699291485
},
"harness|arc:challenge|25": {
"acc": 0.27986348122866894,
"acc_stderr": 0.01311904089772592,
"acc_norm": 0.3003412969283277,
"acc_norm_stderr": 0.013395909309957009
},
"harness|hellaswag|10": {
"acc": 0.3906592312288389,
"acc_stderr": 0.00486901015228075,
"acc_norm": 0.4910376419040032,
"acc_norm_stderr": 0.004988979750014438
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322716,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322716
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123394,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123394
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.02749566368372407,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.02749566368372407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.02767845257821238,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.02767845257821238
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.038924311065187504,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.038924311065187504
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2,
"acc_stderr": 0.0333333333333333,
"acc_norm": 0.2,
"acc_norm_stderr": 0.0333333333333333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928704,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055952,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055952
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1717171717171717,
"acc_stderr": 0.026869716187429917,
"acc_norm": 0.1717171717171717,
"acc_norm_stderr": 0.026869716187429917
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877795,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877795
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.021107730127244,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.021107730127244
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882385,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882385
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567978,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567978
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30275229357798167,
"acc_stderr": 0.019698711434756343,
"acc_norm": 0.30275229357798167,
"acc_norm_stderr": 0.019698711434756343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.02934666509437295,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.02934666509437295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.20098039215686275,
"acc_stderr": 0.028125972265654355,
"acc_norm": 0.20098039215686275,
"acc_norm_stderr": 0.028125972265654355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035296,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.015866243073215037,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.015866243073215037
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432403,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432403
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.028888193103988644,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.028888193103988644
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.018185218954318086,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.018185218954318086
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.19883040935672514,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.19883040935672514,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.01450904517148729,
"mc2": 0.37005968856579075,
"mc2_stderr": 0.014337009699291485
},
"harness|winogrande|5": {
"acc": 0.5406471981057617,
"acc_stderr": 0.014005973823825136
},
"harness|gsm8k|5": {
"acc": 0.017437452615617893,
"acc_stderr": 0.003605486867998255
}
}
```
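As a quick sanity check against the numbers above, you can fetch the raw results file and average the per-task MMLU accuracies yourself. This is a minimal sketch assuming the file's top-level layout matches the excerpt above (some dumps nest the per-task dict under a "results" key instead, which the sketch also handles):

```python
import json

from huggingface_hub import hf_hub_download

# fetch the raw results file referenced above (filename taken from this card)
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k",
    repo_type="dataset",
    filename="results_2024-02-17T14-41-59.887810.json",
)

with open(path) as f:
    results = json.load(f)

# assumption: top-level keys match the excerpt above; fall back to the
# nested "results" dict when present
tasks = results.get("results", results)
mmlu = [v["acc"] for k, v in tasks.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU tasks: {len(mmlu)}, mean acc: {sum(mmlu) / len(mmlu):.4f}")
```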
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k | [
"region:us"
] | 2024-02-17T14:29:57+00:00 | {"pretty_name": "Evaluation run of DatPySci/pythia-1b-sft-50k", "dataset_summary": "Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-sft-50k](https://huggingface.co/DatPySci/pythia-1b-sft-50k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T14:41:59.887810](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-sft-50k/blob/main/results_2024-02-17T14-41-59.887810.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24467360878008643,\n \"acc_stderr\": 0.03024534477539282,\n \"acc_norm\": 0.24555697815658845,\n \"acc_norm_stderr\": 0.030978837434188194,\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.01450904517148729,\n \"mc2\": 0.37005968856579075,\n \"mc2_stderr\": 0.014337009699291485\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.27986348122866894,\n \"acc_stderr\": 0.01311904089772592,\n \"acc_norm\": 0.3003412969283277,\n \"acc_norm_stderr\": 0.013395909309957009\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3906592312288389,\n \"acc_stderr\": 0.00486901015228075,\n \"acc_norm\": 0.4910376419040032,\n \"acc_norm_stderr\": 0.004988979750014438\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322716,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322716\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123394,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123394\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.02749566368372407,\n \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.02749566368372407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 
0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.02767845257821238,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.02767845257821238\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.038924311065187504,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.038924311065187504\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.0333333333333333,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.039325376803928704,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.039325376803928704\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055952,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055952\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.1717171717171717,\n \"acc_stderr\": 0.026869716187429917,\n \"acc_norm\": 0.1717171717171717,\n \"acc_norm_stderr\": 0.026869716187429917\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877795,\n \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877795\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.2230769230769231,\n \"acc_stderr\": 0.021107730127244,\n \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.021107730127244\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882385,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882385\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567978,\n \"acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567978\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.30275229357798167,\n \"acc_stderr\": 0.019698711434756343,\n \"acc_norm\": 0.30275229357798167,\n \"acc_norm_stderr\": 0.019698711434756343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24537037037037038,\n \"acc_stderr\": 0.02934666509437295,\n \"acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.02934666509437295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.20098039215686275,\n \"acc_stderr\": 0.028125972265654355,\n \"acc_norm\": 0.20098039215686275,\n \"acc_norm_stderr\": 0.028125972265654355\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035296,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035296\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934724,\n \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934724\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26947637292464877,\n \"acc_stderr\": 0.015866243073215037,\n \"acc_norm\": 
0.26947637292464877,\n \"acc_norm_stderr\": 0.015866243073215037\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888156,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888156\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432403,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432403\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.028888193103988644,\n \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.028888193103988644\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.018185218954318086,\n \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.018185218954318086\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n \"acc_stderr\": 0.03764425585984926,\n \"acc_norm\": 0.19090909090909092,\n \"acc_norm_stderr\": 0.03764425585984926\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.19883040935672514,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.19883040935672514,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.01450904517148729,\n \"mc2\": 0.37005968856579075,\n \"mc2_stderr\": 0.014337009699291485\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5406471981057617,\n \"acc_stderr\": 0.014005973823825136\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.017437452615617893,\n \"acc_stderr\": 0.003605486867998255\n }\n}\n```", "repo_url": 
"https://huggingface.co/DatPySci/pythia-1b-sft-50k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-28-13.386969.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-28-13.386969.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-28-13.386969.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-28-30.118401.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-28-30.118401.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-41-59.887810.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-41-59.887810.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-41-59.887810.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-41-59.887810.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-41-59.887810.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": 
"2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", 
"data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["**/details_harness|winogrande|5_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": ["**/details_harness|winogrande|5_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["**/details_harness|winogrande|5_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T14-41-59.887810.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T14_28_13.386969", "path": ["results_2024-02-17T14-28-13.386969.parquet"]}, {"split": "2024_02_17T14_28_30.118401", "path": 
["results_2024-02-17T14-28-30.118401.parquet"]}, {"split": "2024_02_17T14_41_59.887810", "path": ["results_2024-02-17T14-41-59.887810.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T14-41-59.887810.parquet"]}]}]} | 2024-02-17T14:43:48+00:00 |
46bcf8001d07d4295361c788ee8754aa57254492 | Orenbac/amz-press-release_summarized_embedded | [
"region:us"
] | 2024-02-17T14:30:49+00:00 | {} | 2024-02-17T14:32:21+00:00 |
|
6711d2f2a9c7d585e61ecee5138c3252d3367fb1 |
## Sources of the corpus used
- [Opus](https://opus.nlpl.eu/results/en&eu/corpus-result-table)
- [Orai](https://www.orai.eus/en/resources)
| itzune/basque-parallel-corpus | [
"task_categories:translation",
"language:eu",
"language:en",
"language:es",
"language:fr",
"region:us"
] | 2024-02-17T14:36:43+00:00 | {"language": ["eu", "en", "es", "fr"], "task_categories": ["translation"], "pretty_name": "Parallel Basque Corpus"} | 2024-02-17T14:52:38+00:00 |
4878482e839a5a71bd03f49b755f33916d657a57 | Weni/zeroshot-validation-3.1.0 | [
"region:us"
] | 2024-02-17T14:40:23+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "all_classes", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "language", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 970707, "num_examples": 1000}], "download_size": 95553, "dataset_size": 970707}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T15:16:39+00:00 |
|
b4020ab3a4b832f42e8b226e9be64efb7f3b8f43 |
# Dataset Card for Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [athirdpath/Orca-2-13b-Alpaca-Uncensored](https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored",
"harness_winogrande_5",
split="train")
```
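As a complementary sketch (assuming only the configuration and split names listed in this card's metadata), the aggregated metrics can also be loaded directly from the "results" configuration:
```python
from datasets import load_dataset

# Aggregated metrics across all tasks; the "latest" split mirrors the
# most recent timestamped run of this evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored",
    "results",
    split="latest",
)
```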
## Latest results
These are the [latest results from run 2024-02-17T14:58:45.053086](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored/blob/main/results_2024-02-17T14-58-45.053086.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6002703838763287,
"acc_stderr": 0.032941926494755115,
"acc_norm": 0.6047089287996544,
"acc_norm_stderr": 0.03361522848210774,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5358987502340753,
"mc2_stderr": 0.01565060464007792
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6100378410675165,
"acc_stderr": 0.004867445945277156,
"acc_norm": 0.792670782712607,
"acc_norm_stderr": 0.004045648954769832
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.036117805602848975,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.036117805602848975
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764805,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764805
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630642,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630642
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710855,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710855
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153186,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153186
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388852,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388852
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900933,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900933
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567657,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.01969145905235402,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.01969145905235402
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5358987502340753,
"mc2_stderr": 0.01565060464007792
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902545
},
"harness|gsm8k|5": {
"acc": 0.38286580742987114,
"acc_stderr": 0.013389223491820465
}
}
```
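For quick inspection, a minimal sketch along these lines can summarise the per-task scores. It assumes the downloaded JSON file has the flat task-to-metrics shape shown in the excerpt above; the actual file may nest these metrics under additional keys.
```python
import json

# Print the accuracy of every task that reports one, assuming a mapping
# like {"harness|hendrycksTest-...|5": {"acc": ..., ...}} as shown above.
with open("results_2024-02-17T14-58-45.053086.json") as f:
    results = json.load(f)

for task, metrics in results.items():
    if isinstance(metrics, dict) and "acc" in metrics:
        print(f"{task}: acc={metrics['acc']:.4f}")
```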
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored | [
"region:us"
] | 2024-02-17T15:01:02+00:00 | {"pretty_name": "Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored", "dataset_summary": "Dataset automatically created during the evaluation run of model [athirdpath/Orca-2-13b-Alpaca-Uncensored](https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T14:58:45.053086](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored/blob/main/results_2024-02-17T14-58-45.053086.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6002703838763287,\n \"acc_stderr\": 0.032941926494755115,\n \"acc_norm\": 0.6047089287996544,\n \"acc_norm_stderr\": 0.03361522848210774,\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5358987502340753,\n \"mc2_stderr\": 0.01565060464007792\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6100378410675165,\n \"acc_stderr\": 0.004867445945277156,\n \"acc_norm\": 0.792670782712607,\n \"acc_norm_stderr\": 0.004045648954769832\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.036117805602848975,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.036117805602848975\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764805,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764805\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630642,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630642\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710855,\n \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710855\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.776500638569604,\n \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153186,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153186\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900933,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900933\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n \"acc_stderr\": 0.012671902782567657,\n \"acc_norm\": 0.4380704041720991,\n \"acc_norm_stderr\": 0.012671902782567657\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235402,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235402\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5358987502340753,\n \"mc2_stderr\": 0.01565060464007792\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902545\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38286580742987114,\n \"acc_stderr\": 0.013389223491820465\n 
}\n}\n```", "repo_url": "https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T14_58_45.053086", "path": ["**/details_harness|winogrande|5_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T14-58-45.053086.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T14_58_45.053086", "path": ["results_2024-02-17T14-58-45.053086.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T14-58-45.053086.parquet"]}]}]} | 2024-02-17T15:01:23+00:00 |
53dd169c863af510885a96925c6f95f0a02aba77 | Guilherme34/Pygmalion-dataset | [
"region:us"
] | 2024-02-17T15:08:21+00:00 | {} | 2024-02-17T15:09:42+00:00 |
|
62888c073f7a0b16e25e84852b46229af80e3f89 | Weni/zeroshot-3.1.0 | [
"region:us"
] | 2024-02-17T15:09:41+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "all_classes", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "language", "dtype": {"class_label": {"names": {"0": "pt", "1": "en", "2": "es"}}}}], "splits": [{"name": "train", "num_bytes": 18334765, "num_examples": 29448}], "download_size": 8387033, "dataset_size": 18334765}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T15:10:56+00:00 |
|
a27f09c7090519d25f10ce86aa7a9bb7a70edc22 | ANDRIZZBIRCH/arianagrandesweetner | [
"task_categories:text-classification",
"language:aa",
"license:apache-2.0",
"region:us"
] | 2024-02-17T15:09:46+00:00 | {"language": ["aa"], "license": "apache-2.0", "task_categories": ["text-classification"]} | 2024-02-17T15:14:44+00:00 |
|
a9197c003f4483b63083a474759a03a42d213f02 | alinet/balanced_qg | [
"region:us"
] | 2024-02-17T15:12:53+00:00 | {} | 2024-02-17T15:13:40+00:00 |
|
6c57624d94cdec5af451133a8430797d435542d8 | benayas/massive_augmented_20pct_v1 | [
"region:us"
] | 2024-02-17T15:17:04+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "locale", "dtype": "string"}, {"name": "partition", "dtype": "string"}, {"name": "scenario", "dtype": "float64"}, {"name": "intent", "dtype": "float64"}, {"name": "utt", "dtype": "string"}, {"name": "annot_utt", "dtype": "string"}, {"name": "worker_id", "dtype": "string"}, {"name": "slot_method", "struct": [{"name": "method", "sequence": "null"}, {"name": "slot", "sequence": "null"}]}, {"name": "judgments", "struct": [{"name": "grammar_score", "sequence": "int8"}, {"name": "intent_score", "sequence": "int8"}, {"name": "language_identification", "sequence": "null"}, {"name": "slots_score", "sequence": "int8"}, {"name": "spelling_score", "sequence": "int8"}, {"name": "worker_id", "sequence": "null"}]}, {"name": "category", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1758967, "num_examples": 11514}], "download_size": 472580, "dataset_size": 1758967}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T15:17:06+00:00 |
|
59fb34ca7c3f3c613c5c24ebe5a80fbb53e06ae0 | mb7419/legal-advice-reddit | [
"region:us"
] | 2024-02-17T15:21:12+00:00 | {"dataset_info": {"features": [{"name": "ques_title", "dtype": "string"}, {"name": "ques_text", "dtype": "string"}, {"name": "ques_created", "dtype": "string"}, {"name": "ques_score", "dtype": "int64"}, {"name": "ans_text", "dtype": "string"}, {"name": "ans_created", "dtype": "string"}, {"name": "ans_score", "dtype": "float64"}, {"name": "dominant_topic_name", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 228451850, "num_examples": 115359}, {"name": "validation", "num_bytes": 49046245, "num_examples": 24720}, {"name": "test", "num_bytes": 49058903, "num_examples": 24720}], "download_size": 197856704, "dataset_size": 326556998}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-17T15:21:22+00:00 |
|
33a5335cb3c30d270cb984f980894c95bbe1fc73 | 1. edjudgement.zip consist of originally scrapped pdf files
2. pdf_to_text_raw.csv is a processed data (stage 1 - non refined) from the scraped pdf | izardy/malaysia-ejudgement | [
"region:us"
] | 2024-02-17T15:28:45+00:00 | {} | 2024-02-17T16:01:56+00:00 |
e68c1a09560c9f8359873f74fd5db534f08dc00b | hannademaria/ARVR | [
"region:us"
] | 2024-02-17T15:31:30+00:00 | {} | 2024-02-17T15:31:30+00:00 |
|
4570667d64bcd1741501c700899b11f321198570 | # Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)
## Project Overview
The Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Using Python within Google Colab, we aim to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.
## Background and Rationale
Durham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.
## Data Sources and Methodology
### Data Sources
We will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:
- `GS_TreeInventory.shp`
- `Trees_&_Planting_Sites.csv`
- `Trees_%26_Planting_Sites.geojson`
# Dataset Card for Urban Tree Inventory
## Dataset Description
This dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.
### Spatial Data (GeoJSON)
**Format:** GeoJSON
**Content:**
- **Type:** `FeatureCollection` - A collection of feature objects.
- **Features:** Each feature object represents a tree and contains:
- **Type:** `Feature`
- **Geometry:** `Point` (includes longitude and latitude of the tree location).
- **Properties:** Detailed information about the tree (some fields may overlap with the CSV structure below).
### Tabular Data (CSV)
**Format:** CSV
**Columns:**
- **X, Y:** Coordinates of the tree location.
- **OBJECTID:** Unique identifier for the tree.
- **streetaddress:** Street address nearest to the tree.
- **city:** City where the tree is located.
- **zipcode:** Zip code for the location of the tree.
- **facilityid:** Identifier for the facility associated with the tree, if any.
- **present:** Indication of whether the tree is currently present.
- **genus, species, commonname:** Botanical and common names of the tree.
- **plantingdate:** Date when the tree was planted.
- **diameterin:** Diameter of the tree trunk in inches.
- **heightft:** Height of the tree in feet.
- **condition:** Health condition of the tree.
- **contractwork:** Indicates if the tree has had any contract work done.
- **neighborhood:** Neighborhood where the tree is located.
- **program:** The program under which the tree was planted.
- **plantingw:** Width of the planting site.
- **plantingcond:** Condition of the planting site.
- **underpwerlins:** Whether the tree is under power lines.
- **matureheight:** The mature height of the tree.
- **GlobalID:** A global unique identifier for the tree.
- **created_user:** The user who created the record.
- **created_date:** The date the record was created.
- **last_edited_user:** The user who last edited the record.
- **last_edited_date:** The date the record was last edited.
#### Environmental and Economic Data:
- **isoprene, monoterpene, vocs:** Emissions and absorption data for various compounds.
- **coremoved_ozperyr, o3removed_ozperyr, etc.:** Annual pollutant removal metrics.
- **o2production_lbperyr:** Annual oxygen production.
- **carbonstorage_lb, carbonstorage_dol:** Carbon storage metrics.
- **grosscarseq_lbperyr, grosscarseq_dolperyr:** Gross carbon sequestration.
- **avoidrunoff_ft2peryr, avoidrunoff_dol2peryr:** Metrics related to stormwater runoff avoidance.
- **totannbenefits_dolperyr:** Total annual dollar benefits from the tree.
- **leafarea_sqft, potevapotran_cuftperyr, etc.:** Metrics related to the water cycle.
- **heating_mbtuperyr, cooling_kwhperyr, etc.:** Energy savings related to the tree's impact on building energy use.
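Because the benefit columns are already denominated in dollars per year, a quick aggregation can show where the canopy currently delivers the most value. The snippet below is a minimal sketch: it assumes the CSV uses exactly the `neighborhood` and `totannbenefits_dolperyr` column names listed above.

```python
import pandas as pd

# Tabular inventory; column names as documented above.
df = pd.read_csv("Trees_&_Planting_Sites.csv")

# Total annual dollar benefits per neighborhood -- a first look at which
# areas receive the most ecosystem-service value from existing trees.
benefits = (
    df.groupby("neighborhood")["totannbenefits_dolperyr"]
      .sum()
      .sort_values(ascending=False)
)
print(benefits.head(10))
```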
### Example Record
**GeoJSON Feature:**
```json
{
"type": "Feature",
"geometry": {
"type": "Point",
"coordinates": [-78.90863, 36.00441]
},
"properties": {
"OBJECTID": 2840940,
"commonname": "Willow Oak",
// Additional properties...
}
}
```
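To work with all three files together, the spatial layers can be read directly and the CSV promoted to a GeoDataFrame from its X/Y columns. This is a minimal sketch; it assumes the CSV's X/Y coordinates share the shapefile's coordinate reference system.

```python
import geopandas as gpd
import pandas as pd

# Spatial layers: both carry point geometries for each tree.
trees_shp = gpd.read_file("GS_TreeInventory.shp")
trees_geojson = gpd.read_file("Trees_%26_Planting_Sites.geojson")

# Tabular layer: rebuild point geometry from the X/Y columns.
csv_df = pd.read_csv("Trees_&_Planting_Sites.csv")
trees_csv = gpd.GeoDataFrame(
    csv_df,
    geometry=gpd.points_from_xy(csv_df["X"], csv_df["Y"]),
    crs=trees_shp.crs,  # assumption: CSV coordinates use the shapefile's CRS
)

print(trees_shp.shape, trees_geojson.shape, trees_csv.shape)
```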
The `GS_TreeInventory.shp` file encompasses a range of attributes for each record:
- **OBJECTID:** Unique identifier for each record.
- **streetaddr:** Street address where the tree or planting site is located.
- **city:** The city name, which is Durham.
- **zipcode:** Postal code for the location.
- **facilityid:** Identifier possibly linked to a facility or area associated with the tree.
- **present:** Type of feature present, such as a tree or a planting site.
- **genus:** Genus of the tree.
- **species:** Species of the tree.
- **commonname:** Common name of the tree.
- **plantingda:** Date or year range when the tree was planted or the planting site was established.
- ...
### Objectives
1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.
2. Apply Python libraries to uncover relationships between tree canopy data and urban development.
3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.
4. Produce analyses and visualizations with the GeoJSON file.
### Methodology
Our analytical process within Google Colab will encompass the steps below (a minimal Python sketch follows the list):
- **Data Preparation and Integration:** Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.
- **Geospatial Analysis:** Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.
- **Visualization and Optimization:** Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.
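As a concrete starting point, the sketch below groups trees by neighborhood and recorded condition and draws a simple condition map. It assumes the GeoJSON properties include the `neighborhood` and `condition` fields described above; field availability should be verified against the actual file.

```python
import geopandas as gpd
import matplotlib.pyplot as plt

trees = gpd.read_file("Trees_%26_Planting_Sites.geojson")

# Condition counts per neighborhood: a first cut at spotting areas where
# the canopy is thin or in poor health and planting should be prioritized.
by_area = (
    trees.groupby(["neighborhood", "condition"])
         .size()
         .unstack(fill_value=0)
)
print(by_area.head())

# Point map of tree locations colored by recorded condition.
ax = trees.plot(column="condition", legend=True, markersize=2, figsize=(8, 8))
ax.set_title("Durham trees by recorded condition")
plt.savefig("durham_tree_condition.png", dpi=150)
```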
## Deliverables
1. A collection of Google Colab Python notebooks that outline our analytical processes.
2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics.
3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.
## Limitations
- **Computational Resources:** Google Colab's limited compute may constrain the size of the datasets and the complexity of the models we can employ.
- **Data Quality:** The accuracy and currency of the data ultimately affect the precision of our recommendations.
- **Sociopolitical Considerations:** Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.
## Conclusion
DUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment. | Ziyuan111/Urban_Tree_Canopy_in_Durham2 | [
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-17T15:32:47+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"]} | 2024-02-17T16:36:59+00:00 |
03605da446ad99e0d52506c72536900e270a768f |
<div align="center">
<h1> HALvest </h1>
<h3> Multilingual Research Papers Harvested from HAL </h3>
</div>
## Dataset Description
- **Repository:** [GitHub](https://github.com/Madjakul/HALvesting/tree/main)
- **Papers:** TBD
## Dataset Summary
### Overview:
HALvest is a dataset comprising the full text of open papers from [Hyper Articles en Ligne (HAL)](https://hal.science/). Our dump gathers papers written in TBD languages across TBD domains from TBD/date to TBD/date.
You can download the dataset using Hugging Face datasets:
*You may need to follow these instructions to setup authentication before downloading the dataset: [https://huggingface.co/docs/huggingface_hub/quick-start#login](https://huggingface.co/docs/huggingface_hub/quick-start#login)*
```py
from datasets import load_dataset
ds = load_dataset(
"Madjakul/halvest",
"en",
trust_remote_code=True
)
```
### Details:
- We first query [HAL's API](https://api.archives-ouvertes.fr/docs) to gather open research papers and parse the responses -- effectively sorting papers by language, year of production, and domain (a sketch of this step follows the list).
- We then download the PDFs of the fetched papers.
- Using [GROBID](https://github.com/kermitt2/grobid), we convert each PDF to the `xml-tei` format to obtain structured data.
- Using [TBD](https://github.com), we convert each `xml-tei` file to an LLM-compatible `txt` format and concatenate it with the paper's metadata fetched earlier.
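For illustration, the first step can be sketched as a plain Solr-style query against HAL's search endpoint. The endpoint and response shape follow HAL's public documentation; the specific filter and `fl` field names used here are assumptions and should be checked against https://api.archives-ouvertes.fr/docs.

```python
import requests

# Minimal sketch of the harvesting step: fetch open-access documents
# with their language, domain, year, and main-file URL.
params = {
    "q": "*:*",
    "fq": "openAccess_bool:true",  # assumption: open-access filter field
    "fl": "docid,language_s,domain_s,producedDateY_i,fileMain_s",  # assumed field names
    "wt": "json",
    "rows": 100,
}
resp = requests.get(
    "https://api.archives-ouvertes.fr/search/", params=params, timeout=30
)
resp.raise_for_status()
for doc in resp.json()["response"]["docs"]:
    print(doc.get("docid"), doc.get("language_s"), doc.get("fileMain_s"))
```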
### Languages
The supported languages and statistics for our dataset can be found below:
TBD
### Domains
The supported domains and statistics for our dataset can be found below:
TBD
### Dataset Structure
```json
{
"halid": ...,
"lang": ...,
"domain": ...,
"timestamp": ...,
"year": ...,
"url": ...,
"text": ...
}
```
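Given this record layout, a streaming pass can filter papers without downloading the whole dump. This is a minimal sketch: the `train` split name and the exact encoding of the `year` and `domain` fields are assumptions to verify against the released data.

```python
from datasets import load_dataset

ds = load_dataset(
    "Madjakul/halvest", "en", streaming=True, trust_remote_code=True
)

# Keep only recent papers; field names follow the record structure above.
for row in ds["train"]:
    if int(row["year"]) >= 2020:
        print(row["halid"], row["domain"], row["url"])
        break
```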
## Considerations for Using the Data
HALvest is a direct conversion of research papers found on HAL. Hence, full names of individuals, as well as their research email addresses, can be found within the dataset. This must be considered prior to using this dataset for any purpose, such as training deep learning models, etc.
## Dataset Copyright
The license terms for HALvest strictly follow those of HAL. Please refer to the license below when using this dataset.
- [HAL license](https://doc.archives-ouvertes.fr/en/legal-aspects/)
## Citation
```bibtex
@proceedings{TBD
}
```
## Acknowledgment
This dataset is built upon the following works:
```
GROBID: A machine learning software for extracting information from scholarly documents
https://github.com/kermitt2/grobid
harvesting: Collection of data parsers for harvested data in [...]
[...]
```
| Madjakul/halvest | [
"task_categories:text-generation",
"task_categories:fill-mask",
"academia",
"research",
"region:us"
] | 2024-02-17T15:42:06+00:00 | {"annotations_creators": ["no-annotation"], "multilinguality": ["multilingual"], "source_datasets": ["original"], "task_categories": ["text-generation", "fill-mask"], "task_ids": ["language-modeling", "masked-language-modeling"], "pretty_name": "HALvest", "tags": ["academia", "research"]} | 2024-02-17T15:53:20+00:00 |
afe9f82a5b85b8bfc258d5348d9821a67ab7d45e | mesolitica/noisy-augmentation | [
"region:us"
] | 2024-02-17T15:45:07+00:00 | {} | 2024-02-17T15:47:31+00:00 |
|
b53c97aeaeb9e1b4459299f38b282bd76a3c5586 |
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v5.6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-17T15:44:22.548008](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6/blob/main/results_2024-02-17T15-44-22.548008.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6501937924638518,
"acc_stderr": 0.032025249091365754,
"acc_norm": 0.6494469200952948,
"acc_norm_stderr": 0.03269529478578274,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7781424860062839,
"mc2_stderr": 0.013751565023330138
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.01327307786590759,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869147
},
"harness|hellaswag|10": {
"acc": 0.7137024497112129,
"acc_stderr": 0.004511063351278702,
"acc_norm": 0.8899621589324835,
"acc_norm_stderr": 0.00312297363203947
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931055,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931055
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993469,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993469
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863937,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863937
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7781424860062839,
"mc2_stderr": 0.013751565023330138
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433535
},
"harness|gsm8k|5": {
"acc": 0.6967399545109931,
"acc_stderr": 0.012661502663418697
}
}
```
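To inspect the aggregated metrics directly, the "results" configuration can be loaded with its "latest" split, which always points at the most recent run:

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6",
    "results",
    split="latest",
)
print(results[0])
```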
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6 | [
"region:us"
] | 2024-02-17T15:46:42+00:00 | {"pretty_name": "Evaluation run of bardsai/jaskier-7b-dpo-v5.6", "dataset_summary": "Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T15:44:22.548008](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6/blob/main/results_2024-02-17T15-44-22.548008.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501937924638518,\n \"acc_stderr\": 0.032025249091365754,\n \"acc_norm\": 0.6494469200952948,\n \"acc_norm_stderr\": 0.03269529478578274,\n \"mc1\": 0.6303549571603427,\n \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7781424860062839,\n \"mc2_stderr\": 0.013751565023330138\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.01327307786590759,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869147\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7137024497112129,\n \"acc_stderr\": 0.004511063351278702,\n \"acc_norm\": 0.8899621589324835,\n \"acc_norm_stderr\": 0.00312297363203947\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931055,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931055\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993469,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993469\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n \"acc_stderr\": 0.012755368722863937,\n \"acc_norm\": 0.4758800521512386,\n \"acc_norm_stderr\": 0.012755368722863937\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7781424860062839,\n \"mc2_stderr\": 0.013751565023330138\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433535\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \"acc_stderr\": 0.012661502663418697\n }\n}\n```", "repo_url": 
"https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|arc:challenge|25_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|gsm8k|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hellaswag|10_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["**/details_harness|winogrande|5_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T15-44-22.548008.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T15_44_22.548008", "path": ["results_2024-02-17T15-44-22.548008.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T15-44-22.548008.parquet"]}]}]} | 2024-02-17T15:47:04+00:00 |
f425906b3e9e0c339730ce5cfd630c706b601b70 | # Dataset Card for "samantar_with_idx_merged_with_train_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mlsquare/samantar_with_idx_merged_with_train_val | [
"region:us"
] | 2024-02-17T15:52:01+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "idx", "dtype": "int64"}, {"name": "tgt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11263548923.940233, "num_examples": 79638793}, {"name": "valid", "num_bytes": 2815887337.0597663, "num_examples": 19909699}], "download_size": 9506498914, "dataset_size": 14079436261.0}} | 2024-02-17T16:17:45+00:00 |
2bc2ae985070fc69750b3a8c49f2aefb83dc3428 | saniket919/verbalyze_chat_multilingual | [
"region:us"
] | 2024-02-17T15:54:39+00:00 | {} | 2024-02-17T15:54:39+00:00 |
|
77c688e2b35e0e786a4b7a4bf264cbd514fcd027 | verbalyze/verbalyze_chat_multilingual | [
"region:us"
] | 2024-02-17T15:54:53+00:00 | {} | 2024-02-17T16:12:48+00:00 |