sha | text | id | tags | created_at | metadata | last_modified | arxiv | languages | tags_str | text_str | text_lists | processed_texts
---|---|---|---|---|---|---|---|---|---|---|---|---|
bf8e1b2b41aa64258d0c148df7ff26d735356651 | # Dataset Card for "autotrain-data-solar-strike"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Hjallti/autotrain-data-solar-strike | [
"region:us"
]
| 2023-12-06T20:04:14+00:00 | {"dataset_info": {"features": [{"name": "autotrain_text", "dtype": "string"}, {"name": "autotrain_prompt", "dtype": "string"}, {"name": "autotrain_rejected_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 123976169, "num_examples": 9477}, {"name": "validation", "num_bytes": 123976169, "num_examples": 9477}], "download_size": 75554658, "dataset_size": 247952338}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2023-12-06T20:04:17+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "autotrain-data-solar-strike"
More Information needed | [
"# Dataset Card for \"autotrain-data-solar-strike\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"autotrain-data-solar-strike\"\n\nMore Information needed"
]
|
79c9f1c9cbfd8c991f9bc0f542c2bb844c36392a | # Dataset Card for "fms-bench"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | nayohan/fms-bench-raw | [
"region:us"
]
| 2023-12-06T20:36:54+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "index", "dtype": "int64"}, {"name": "dataID", "dtype": "string"}, {"name": "relationship", "dtype": "string"}, {"name": "time_interval", "sequence": "string"}, {"name": "summary", "sequence": "string"}, {"name": "first_session_dialogue", "sequence": "string"}, {"name": "first_session_speakers", "sequence": "string"}, {"name": "second_session_dialogue", "sequence": "string"}, {"name": "second_session_speakers", "sequence": "string"}, {"name": "third_session_dialogue", "sequence": "string"}, {"name": "third_session_speakers", "sequence": "string"}, {"name": "fourth_session_dialogue", "sequence": "string"}, {"name": "fourth_session_speakers", "sequence": "string"}, {"name": "fifth_session_dialogue", "sequence": "string"}, {"name": "fifth_session_speakers", "sequence": "string"}, {"name": "eval_indicator", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 661372, "num_examples": 80}], "download_size": 352262, "dataset_size": 661372}} | 2023-12-06T20:36:58+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "fms-bench"
More Information needed | [
"# Dataset Card for \"fms-bench\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"fms-bench\"\n\nMore Information needed"
]
|
d4ccc01f84f716c45d49167c7435cd6c9b6c635b | # Dataset Card for "fms-bench"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | nayohan/fms-bench | [
"region:us"
]
| 2023-12-06T20:49:16+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "eval_indicator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 332339, "num_examples": 80}], "download_size": 191705, "dataset_size": 332339}} | 2023-12-09T21:33:32+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "fms-bench"
More Information needed | [
"# Dataset Card for \"fms-bench\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"fms-bench\"\n\nMore Information needed"
]
|
17f764bd6da165874dc433af2437f62ed469a8a0 | # Dataset Card for "advices_llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | fightfei/advices_llama2 | [
"region:us"
]
| 2023-12-06T22:24:39+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 392092356.0, "num_examples": 1430994}, {"name": "test", "num_bytes": 8001896.0, "num_examples": 29204}], "download_size": 46548081, "dataset_size": 400094252.0}} | 2023-12-06T22:25:28+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "advices_llama2"
More Information needed | [
"# Dataset Card for \"advices_llama2\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"advices_llama2\"\n\nMore Information needed"
]
|
60b443c714ea1ba7699f47196da09f0b6a6a4ae3 |
Downloaded using PyTDC (https://tdcommons.ai/).
Contains the unique canonicalized SMILES molecules from MOSES, ZINC-250K, and ChEMBL-29; canonicalization was done with RDKit.
Distribution of tokenized SMILES sequence lengths below. The following regex string was used
to split the SMILES molecule into tokens: (\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\.|=|#|-|\+|\\|\/|:|~|@|\?|>>?|\*|\$|\%[0-9]{2}|[0-9])
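For reference, here is a minimal sketch of applying that regex with Python's `re` module; the helper name and the example molecule are illustrative and not part of the dataset's tooling.
```py
import re

# Tokenization regex copied verbatim from the card text above.
SMILES_TOKEN_RE = re.compile(
    r"(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\.|=|#|-|\+|\\|\/|:|~|@|\?|>>?|\*|\$|\%[0-9]{2}|[0-9])"
)

def tokenize_smiles(smiles):
    """Split a SMILES string into tokens using the regex above."""
    return SMILES_TOKEN_RE.findall(smiles)

# Illustrative example (caffeine); len(tokens) should correspond to the
# "smi_len" column described below.
tokens = tokenize_smiles("CN1C=NC2=C1C(=O)N(C(=O)N2C)C")
print(len(tokens), tokens)
```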
<img src="violin_allmolgen_cano.png" width=50% height=50%>
Included in the .csv (after extracting the .tar.xz file) is a column "smi_len".
If using the same SMILES tokenization regex string as above, you can simply filter using the values in this column ("smi_len").
I'd recommend post-processing, since the vast majority of sequences are much shorter than the longest, which is above 1400 tokens (using my regex string). | Pixelatory/AllMolGen | [
"size_categories:1M<n<10M",
"chemistry",
"region:us"
]
| 2023-12-06T22:44:48+00:00 | {"size_categories": ["1M<n<10M"], "tags": ["chemistry"], "configs": [{"config_name": "default", "data_files": "allmolgen.tar.xz"}]} | 2023-12-07T02:08:42+00:00 | []
| []
| TAGS
#size_categories-1M<n<10M #chemistry #region-us
|
Downloaded using PyTDC (URL
Contains the unique canonicalized SMILES molecules from MOSES, ZINC-250K, and ChEMBL-29, done with RDKit.
Distribution of tokenized SMILES sequence lengths below. The following regex string was used
to split the SMILES molecule into tokens: (\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\.|=|#|-|\+|\\|\/|:|~|@|\?|>>?|\*|\$|\%[0-9]{2}|[0-9])
<img src="violin_allmolgen_cano.png" width=50% height=50%>
Included in the .csv (after extracting the .URL file) is a column "smi_len".
If using the same SMILES tokenization regex string as above, you can simply filter using the values in this column ("smi_len").
I'd recommend post-processing since clearly a majority of the sequences are of a much shorter length than the highest, which is above 1400 (using my regex string). | []
| [
"TAGS\n#size_categories-1M<n<10M #chemistry #region-us \n"
]
|
f1f9cbc1a7579a8b0afa05e65f94aa7ff9102513 | # Dataset Card for "medical_healthwa_all_2.0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | atom92/medical_healthwa_all_2.0 | [
"region:us"
]
| 2023-12-06T22:48:46+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 45256859, "num_examples": 52105}], "download_size": 19513894, "dataset_size": 45256859}} | 2023-12-06T22:51:40+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "medical_healthwa_all_2.0"
More Information needed | [
"# Dataset Card for \"medical_healthwa_all_2.0\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"medical_healthwa_all_2.0\"\n\nMore Information needed"
]
|
8f82d19c99cb099d029389782d8726ca234facd9 | # Dataset Card for "advices_llama2_2w"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | fightfei/advices_llama2_2w | [
"region:us"
]
| 2023-12-06T23:02:39+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7428021.0, "num_examples": 19599}, {"name": "test", "num_bytes": 151979.0, "num_examples": 401}], "download_size": 661329, "dataset_size": 7580000.0}} | 2023-12-06T23:03:36+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "advices_llama2_2w"
More Information needed | [
"# Dataset Card for \"advices_llama2_2w\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"advices_llama2_2w\"\n\nMore Information needed"
]
|
c7de4b19a869a493a8695994ea2063daa5905599 | # Danbooru SFW 512 General Filter Dataset
This dataset is meant to be used for training a simple binary classifier that can filter the
Danbooru SFW 2021 dataset. There are two classes: "accepted" and "rejected", with "accepted"
representing samples that should pass through the filter and "rejected" representing samples
that should not.
Generally, to be accepted, a sample should meet the following criteria:
1. No horizontal/vertical bars, or bars that can be removed easily by filling in the background
color. This precludes any samples with bars which cut through any part of the image that is not
a solid color block (i.e. patterns, detailed backgrounds, and foreground elements).
2. Not overtly sexual. The SFW subset of Danbooru 2021 which this dataset is based on has
already removed almost all nudity, this criterion simply removes some of the borderline NSFW images.
3. Somewhat aesthetic.
Please note that dataset samples were selected manually, so there was somewhat of a tradeoff between
speed and accuracy. Despite this, I've managed to train a binary classifier up to 85% test
accuracy on this dataset. I've also observed pretty good performance when using that classifier on
entirely new portions of the base dataset.
## Original Dataset Citation
```bibtex
@misc{danbooru2021,
author={Anonymous and Danbooru community and Gwern Branwen},
title={Danbooru2021: A Large-Scale Crowdsourced and Tagged Anime Illustration Dataset},
howpublished={\url{https://gwern.net/danbooru2021}},
url={https://gwern.net/danbooru2021},
type={dataset},
year={2022},
month={January},
timestamp={2022-01-21},
note={Accessed: 2023-12-06}
}
``` | hayden-donnelly/db-sfw-512-general-filter-dataset | [
"task_categories:image-classification",
"size_categories:10K<n<100K",
"region:us"
]
| 2023-12-06T23:24:18+00:00 | {"size_categories": ["10K<n<100K"], "task_categories": ["image-classification"]} | 2024-02-02T00:03:03+00:00 | []
| []
| TAGS
#task_categories-image-classification #size_categories-10K<n<100K #region-us
| # Danbooru SFW 512 General Filter Dataset
This dataset is meant to be used for training a simple binary classifier that can filter the
Danbooru SFW 2021 dataset. There are two classes: "accepted" and "rejected", with "accepted"
representing samples that should pass through the filter and "rejected" representing samples
that should not.
Generally, to be accepted, a sample should meet the following criteria:
1. No horizontal/vertical bars, or bars that can be removed easily by filling in the background
color. This precludes any samples with bars which cut through any part of the image that is not
a solid color block (i.e. patterns, detailed backgrounds, and foreground elements).
2. Not overtly sexual. The SFW subset of Danbooru 2021 which this dataset is based on has
already removed almost all nudity, this criterion simply removes some of the borderline NSFW images.
3. Somewhat aesthetic.
Please note that dataset samples were selected manually, so there was somewhat of a tradeoff between
speed and accuracy. Despite this, I've managed to train a binary classifier up to 85% test
accuracy on this dataset. I've also observed pretty good performance when using that classifier on
entirely new portions of the base dataset.
## Original Dataset Citation
| [
"# Danbooru SFW 512 General Filter Dataset\nThis dataset is meant to be used for training a simple binary classifier that can filter the \nDanbooru SFW 2021 dataset. There are two classes: \"accepted\" and \"rejected\", with \"accepted\" \nrepresenting samples that should pass through the filter and \"rejected\" representing samples \nthat should not. \n\nGenerallly, to be accepted a sample should meet the following criteria:\n\n1. No horizontal/vertical bars, or bars that can be removed easily by filling in the background\ncolor. This precludes any samples with bars which cut through any part of the image that is not\na solid color block (i.e. patterns, detailed backgrounds, and foreground elements).\n\n2. Not overtly sexual. The SFW subset of Danbooru 2021 which this dataset is based on has \nalready removed almost all nudity, this criterion simply removes some of the borderline NSFW images.\n\n3. Somewhat aesthetic. \n\nPlease note that dataset samples were selected manually, so there was somewhat of a tradeoff between \nspeed and accuracy. Despite this, I've managed to train a binary classifier up to 85% test\naccuracy on this dataset. I've also observed pretty good performance when using that classifier on \nentirely new portions of the base dataset.",
"## Original Dataset Citation"
]
| [
"TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #region-us \n",
"# Danbooru SFW 512 General Filter Dataset\nThis dataset is meant to be used for training a simple binary classifier that can filter the \nDanbooru SFW 2021 dataset. There are two classes: \"accepted\" and \"rejected\", with \"accepted\" \nrepresenting samples that should pass through the filter and \"rejected\" representing samples \nthat should not. \n\nGenerallly, to be accepted a sample should meet the following criteria:\n\n1. No horizontal/vertical bars, or bars that can be removed easily by filling in the background\ncolor. This precludes any samples with bars which cut through any part of the image that is not\na solid color block (i.e. patterns, detailed backgrounds, and foreground elements).\n\n2. Not overtly sexual. The SFW subset of Danbooru 2021 which this dataset is based on has \nalready removed almost all nudity, this criterion simply removes some of the borderline NSFW images.\n\n3. Somewhat aesthetic. \n\nPlease note that dataset samples were selected manually, so there was somewhat of a tradeoff between \nspeed and accuracy. Despite this, I've managed to train a binary classifier up to 85% test\naccuracy on this dataset. I've also observed pretty good performance when using that classifier on \nentirely new portions of the base dataset.",
"## Original Dataset Citation"
]
|
5c24679f02db60c3911b8bd86a1cc95c16fd4220 | # Ruby dataset
**Custom ruby dataset**
- rspec_dataset
**Bigcode dataset**
- ruby-dataset
- shell-dataset
- python-dataset
- sql-dataset
## rspec dataset
Specs are exclusively gathered from the 'app/services' directory within the specified repositories. This approach is employed since the majority of business logic is encapsulated within these services.
```py
REPO_URLS = [
'https://github.com/diaspora/diaspora.git',
'https://github.com/mastodon/mastodon.git',
'https://github.com/gitlabhq/gitlabhq.git',
'https://github.com/discourse/discourse.git',
'https://github.com/chatwoot/chatwoot.git',
'https://github.com/opf/openproject.git',
]
```
output
```sh
Repository Avg Source Lines Avg Test Lines Test Cases
diaspora 62 156 12
mastodon 97 131 59
gitlabhq 66 154 952
discourse 188 303 49
chatwoot 63 107 50
openproject 86 178 98
------------------------------------------------------------
Total 74 159 1220
------------------------------------------------------------
# avg_source_lines = [62, 97, 66, 188, 63, 86]
# avg_test_lines = [156, 131, 154, 303, 107, 178]
# test_cases = [12, 59, 952, 49, 50, 98]
# Assuming an average of 10 tokens per line of code, which is a rough average for programming languages
# tokens_per_line = 10
# Calculating the total tokens for source and test lines
# total_source_tokens = sum([lines * tokens_per_line for lines in avg_source_lines])
# total_test_tokens = sum([lines * tokens_per_line for lines in avg_test_lines])
# Total tokens
# total_tokens = total_source_tokens + total_test_tokens
# Average tokens per test case
# avg_tokens_per_test_case = total_tokens / sum(test_cases)
# total_tokens, avg_tokens_per_test_case
# -> (15910, 13.040983606557377)
```
When you prepare data for training or inference with an LLM, each example (in this case, each test case or code snippet) needs to fit within the model's context window. The average tokens per test case calculated earlier (approximately 13.04 tokens) is well within the context limits of typical LLMs.
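For completeness, the commented estimate above can be run directly as a short script; this is only a sketch and keeps the card's rough assumption of 10 tokens per line.
```py
# Reproduces the commented token estimate from the output block above.
avg_source_lines = [62, 97, 66, 188, 63, 86]
avg_test_lines = [156, 131, 154, 303, 107, 178]
test_cases = [12, 59, 952, 49, 50, 98]

tokens_per_line = 10  # rough assumption stated above

total_source_tokens = sum(lines * tokens_per_line for lines in avg_source_lines)
total_test_tokens = sum(lines * tokens_per_line for lines in avg_test_lines)
total_tokens = total_source_tokens + total_test_tokens

avg_tokens_per_test_case = total_tokens / sum(test_cases)
print(total_tokens, avg_tokens_per_test_case)  # -> 15910 13.040983606557377
```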
| jaigouk/coding-dataset | [
"region:us"
]
| 2023-12-06T23:39:36+00:00 | {} | 2024-01-02T19:03:59+00:00 | []
| []
| TAGS
#region-us
| # Ruby dataset
Custom ruby dataset
- rspec_dataset
Bigcode dataset
- ruby-dataset
- shell-dataset
- python-dataset
- sql-dataset
## rspec dataset
Specs are exclusively gathered from the 'app/services' directory within the specified repositories. This approach is employed since the majority of business logic is encapsulated within these services
output
When you prepare data for training or inference with an LLM, each example (in this case, each test case or code snippet) needs to fit within this context window. The average tokens per test case calculated earlier (approximately 13.04 tokens) is well within the limits of LLMs
| [
"# Ruby dataset\n\nCustom ruby dataset\n\n- rspec_dataset\n\nBigcode dataset\n\n- ruby-dataset\n- shell-dataset\n- python-dataset\n- sql-dataset",
"## rspec dataset\n\nSpecs are exclusively gathered from the 'app/services' directory within the specified repositories. This approach is employed since the majority of business logic is encapsulated within these services\n\n\noutput\n\n\n\nWhen you prepare data for training or inference with an LLM, each example (in this case, each test case or code snippet) needs to fit within this context window. The average tokens per test case calculated earlier (approximately 13.04 tokens) is well within the limits of LLMs"
]
| [
"TAGS\n#region-us \n",
"# Ruby dataset\n\nCustom ruby dataset\n\n- rspec_dataset\n\nBigcode dataset\n\n- ruby-dataset\n- shell-dataset\n- python-dataset\n- sql-dataset",
"## rspec dataset\n\nSpecs are exclusively gathered from the 'app/services' directory within the specified repositories. This approach is employed since the majority of business logic is encapsulated within these services\n\n\noutput\n\n\n\nWhen you prepare data for training or inference with an LLM, each example (in this case, each test case or code snippet) needs to fit within this context window. The average tokens per test case calculated earlier (approximately 13.04 tokens) is well within the limits of LLMs"
]
|
ed08ab7b779f9e07edef6339690b09b6ae3236d6 | # Dataset Card for "librispeech960-wavlm-large-km1000_asr_tokenized_final_fixed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | cmu-mlsp/librispeech960-wavlm-large-km1000_asr_tokenized_final_fixed | [
"region:us"
]
| 2023-12-06T23:49:59+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "validation_tts", "path": "data/validation_tts-*"}, {"split": "test", "path": "data/test-*"}, {"split": "test_tts", "path": "data/test_tts-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 5169983912, "num_examples": 562482}, {"name": "validation", "num_bytes": 29571960, "num_examples": 5406}, {"name": "validation_tts", "num_bytes": 14785980, "num_examples": 2703}, {"name": "test", "num_bytes": 6104987, "num_examples": 2620}, {"name": "test_tts", "num_bytes": 8664977, "num_examples": 2620}], "download_size": 836237002, "dataset_size": 5229111816}} | 2023-12-07T18:10:56+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "librispeech960-wavlm-large-km1000_asr_tokenized_final_fixed"
More Information needed | [
"# Dataset Card for \"librispeech960-wavlm-large-km1000_asr_tokenized_final_fixed\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"librispeech960-wavlm-large-km1000_asr_tokenized_final_fixed\"\n\nMore Information needed"
]
|
4f9fc0f7d0dc1786c8c32994593d5509529168ea | # Dataset Card for "my-nft-prompt-and-sale-label"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hongerzh/my-nft-prompt-and-sale-label | [
"region:us"
]
| 2023-12-06T23:51:29+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "label", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 5747469808.67, "num_examples": 29339}, {"name": "validation", "num_bytes": 1910439936.185, "num_examples": 9777}, {"name": "test", "num_bytes": 2129410854.38, "num_examples": 9780}], "download_size": 9022431797, "dataset_size": 9787320599.235}} | 2023-12-07T01:03:14+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "my-nft-prompt-and-sale-label"
More Information needed | [
"# Dataset Card for \"my-nft-prompt-and-sale-label\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"my-nft-prompt-and-sale-label\"\n\nMore Information needed"
]
|
72f354c0567a303e3231f085897f5e35945a4d14 | This dataset is generated by [Lilac](http://lilacml.com) for a HuggingFace Space: [huggingface.co/spaces/lilacai/lilac](https://huggingface.co/spaces/lilacai/lilac).
Original dataset: [https://huggingface.co/datasets/glaiveai/glaive-code-assistant](https://huggingface.co/datasets/glaiveai/glaive-code-assistant)
Lilac dataset config:
```yaml
namespace: local
name: glaive
source:
dataset_name: glaiveai/glaive-code-assistant
source_name: huggingface
settings:
ui:
media_paths:
- question
- answer
- - answer_formatted
- answer
markdown_paths: []
```
| lilacai/local-glaive | [
"region:us"
]
| 2023-12-07T00:10:35+00:00 | {} | 2023-12-07T12:29:36+00:00 | []
| []
| TAGS
#region-us
| This dataset is generated by Lilac for a HuggingFace Space: URL
Original dataset: URL
Lilac dataset config:
| []
| [
"TAGS\n#region-us \n"
]
|
7ded40b06dd8a9ecfdbc805b47d71851fc6a315c | This dataset is generated by [Lilac](http://lilacml.com) for a HuggingFace Space: [huggingface.co/spaces/lilacai/lilac](https://huggingface.co/spaces/lilacai/lilac).
Lilac dataset config:
```yaml
namespace: lilac
name: stanford-alpaca
source:
filepaths:
- https://raw.githubusercontent.com/tatsu-lab/stanford_alpaca/main/alpaca_data.json
source_name: json
embeddings:
- path: output
embedding: gte-small
- path: instruction
embedding: gte-small
- path: input
embedding: gte-small
signals:
- path: output
signal:
signal_name: pii
- path: output
signal:
signal_name: text_statistics
- path: output
signal:
signal_name: near_dup
- path: output
signal:
signal_name: lang_detection
- path: output
signal:
embedding: gte-small
signal_name: cluster_hdbscan
- path: output
signal:
embedding: gte-small
namespace: lilac
concept_name: legal-termination
signal_name: concept_score
- path: output
signal:
embedding: gte-small
namespace: lilac
concept_name: negative-sentiment
signal_name: concept_score
- path: output
signal:
embedding: gte-small
namespace: lilac
concept_name: non-english
signal_name: concept_score
- path: output
signal:
embedding: gte-small
namespace: lilac
concept_name: positive-sentiment
signal_name: concept_score
- path: output
signal:
embedding: gte-small
namespace: lilac
concept_name: profanity
signal_name: concept_score
- path: output
signal:
embedding: gte-small
namespace: lilac
concept_name: question
signal_name: concept_score
- path: output
signal:
embedding: gte-small
namespace: lilac
concept_name: source-code
signal_name: concept_score
- path: output
signal:
embedding: gte-small
namespace: lilac
concept_name: toxicity
signal_name: concept_score
- path: instruction
signal:
signal_name: pii
- path: instruction
signal:
signal_name: text_statistics
- path: instruction
signal:
signal_name: near_dup
- path: instruction
signal:
signal_name: lang_detection
- path: instruction
signal:
embedding: gte-small
signal_name: cluster_hdbscan
- path: instruction
signal:
embedding: gte-small
namespace: lilac
concept_name: legal-termination
signal_name: concept_score
- path: instruction
signal:
embedding: gte-small
namespace: lilac
concept_name: negative-sentiment
signal_name: concept_score
- path: instruction
signal:
embedding: gte-small
namespace: lilac
concept_name: non-english
signal_name: concept_score
- path: instruction
signal:
embedding: gte-small
namespace: lilac
concept_name: positive-sentiment
signal_name: concept_score
- path: instruction
signal:
embedding: gte-small
namespace: lilac
concept_name: profanity
signal_name: concept_score
- path: instruction
signal:
embedding: gte-small
namespace: lilac
concept_name: question
signal_name: concept_score
- path: instruction
signal:
embedding: gte-small
namespace: lilac
concept_name: source-code
signal_name: concept_score
- path: instruction
signal:
embedding: gte-small
namespace: lilac
concept_name: toxicity
signal_name: concept_score
- path: input
signal:
signal_name: pii
- path: input
signal:
signal_name: text_statistics
- path: input
signal:
signal_name: near_dup
- path: input
signal:
signal_name: lang_detection
- path: input
signal:
embedding: gte-small
signal_name: cluster_hdbscan
- path: input
signal:
embedding: gte-small
namespace: lilac
concept_name: legal-termination
signal_name: concept_score
- path: input
signal:
embedding: gte-small
namespace: lilac
concept_name: negative-sentiment
signal_name: concept_score
- path: input
signal:
embedding: gte-small
namespace: lilac
concept_name: non-english
signal_name: concept_score
- path: input
signal:
embedding: gte-small
namespace: lilac
concept_name: positive-sentiment
signal_name: concept_score
- path: input
signal:
embedding: gte-small
namespace: lilac
concept_name: profanity
signal_name: concept_score
- path: input
signal:
embedding: gte-small
namespace: lilac
concept_name: question
signal_name: concept_score
- path: input
signal:
embedding: gte-small
namespace: lilac
concept_name: source-code
signal_name: concept_score
- path: input
signal:
embedding: gte-small
namespace: lilac
concept_name: toxicity
signal_name: concept_score
settings:
ui:
media_paths:
- output
- instruction
- input
markdown_paths: []
```
| lilacai/lilac-stanford-alpaca | [
"region:us"
]
| 2023-12-07T00:10:54+00:00 | {} | 2023-12-07T13:57:21+00:00 | []
| []
| TAGS
#region-us
| This dataset is generated by Lilac for a HuggingFace Space: URL
Lilac dataset config:
| []
| [
"TAGS\n#region-us \n"
]
|
e0349a2dcff56030cdbf9e0169ac0beebe4d9911 | # lilac/glaive
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/glaiveai/glaive-code-assistant](https://huggingface.co/datasets/glaiveai/glaive-code-assistant)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-glaive
```
or from python with:
```py
import lilac as ll

ll.download("lilacai/lilac-glaive")
```
| lilacai/lilac-glaive | [
"Lilac",
"region:us"
]
| 2023-12-07T00:12:07+00:00 | {"tags": ["Lilac"]} | 2024-01-17T22:31:55+00:00 | []
| []
| TAGS
#Lilac #region-us
| # lilac/glaive
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/glaive\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
]
| [
"TAGS\n#Lilac #region-us \n",
"# lilac/glaive\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
]
|
93134ca62af8b22d25c5b576b7f23774e7635eef |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | hakanhkl/nimohakan | [
"region:us"
]
| 2023-12-07T00:33:43+00:00 | {} | 2023-12-07T00:35:10+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
|
caec95ae86aed8aa92681e5e24490efda4fdd644 | # Dataset Card for "20231206_chai_prize_reward_model_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ChaiML/20231206_chai_prize_reward_model_data | [
"region:us"
]
| 2023-12-07T00:42:19+00:00 | {"dataset_info": {"features": [{"name": "input_text", "dtype": "string"}, {"name": "labels", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 147932875, "num_examples": 87763}], "download_size": 83237596, "dataset_size": 147932875}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-15T10:27:25+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "20231206_chai_prize_reward_model_data"
More Information needed | [
"# Dataset Card for \"20231206_chai_prize_reward_model_data\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"20231206_chai_prize_reward_model_data\"\n\nMore Information needed"
]
|
3a1e6e3c990fa9a148630fe5efc8acee57fa27d2 |
# UltrachatBR: Um Dataset em Português baseado no Ultrachat
O UltrachatBR é uma versão em português do conhecido dataset [Ultrachat](https://huggingface.co/datasets/stingning/ultrachat), originalmente desenvolvido para o idioma inglês. Este projeto visa disponibilizar uma vasta coleção de diálogos traduzidos para o português, ampliando assim o acesso a recursos de processamento de linguagem natural para a comunidade de língua portuguesa.
## Processo de Tradução
O processo de tradução foi realizado utilizando a API do Google Tradutor, uma ferramenta poderosa para a tradução automática. Ao aplicar essa tecnologia, conseguimos traduzir mais de 1.4 milhão de amostras de diálogos do Ultrachat original para o português. Este passo foi crucial para garantir que os modelos de processamento de linguagem natural tenham um conjunto de dados significativo no idioma português.
## Desafios Enfrentados
Durante a tradução, identificamos algumas palavras-chave essenciais, como 'human' e 'assistant', que não foram traduzidas, pois são termos específicos do dataset Ultrachat. Para garantir a coesão e consistência, realizamos um pré-processamento adicional, substituindo essas palavras por suas equivalentes em português, resultando em "humano" e "assistente".
## Pré-processamento
Além da tradução, realizamos um cuidadoso pré-processamento dos dados para garantir a qualidade e a consistência. Isso incluiu a limpeza de dados irrelevantes, tratamento de valores nulos e a normalização de termos específicos.
## Tamanho do Dataset
O UltrachatBR abrange um impressionante conjunto de 1.453.026 amostras de diálogos em português. Essa riqueza de dados oferece uma base sólida para o treinamento e avaliação de modelos de processamento de linguagem natural voltados para o público de língua portuguesa.
## Como Citar
Se você utilizar o UltrachatBR em suas pesquisas, por favor, cite o dataset da seguinte maneira:
```
@misc{ultrachatBr,
author = {Garcia, Gabriel Lino and Paiola, Pedro Henrique and Frediani, João Otávio and Morelli, Luis Henrique and Correia, João Vitor Mariano and Jodas, Danilo Samuel and Junior, Arnaldo Candido and Penteado, Bruno Elias and Guilherme, Ivan Rizzo and Papa, João Paulo},
title = {UltrachatBR: Um Dataset em Português baseado no Ultrachat},
year = {2023},
doi = {10.57967/hf/1433},
publisher = {Hugging Face},
url = {https://huggingface.co/datasets/recogna-nlp/UltrachatBR}
}
```
Esperamos que o UltrachatBR seja uma valiosa contribuição para a comunidade de processamento de linguagem natural em língua portuguesa. Fique à vontade para explorar e utilizar este recurso em seus projetos!
| recogna-nlp/UltrachatBR | [
"task_categories:conversational",
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:pt",
"license:mit",
"Dataset",
"Portuguese",
"Ultrachat",
"LLM",
"doi:10.57967/hf/1433",
"region:us"
]
| 2023-12-07T01:13:04+00:00 | {"language": ["pt"], "license": "mit", "size_categories": ["1M<n<10M"], "task_categories": ["conversational", "text-generation"], "tags": ["Dataset", "Portuguese", "Ultrachat", "LLM"]} | 2023-12-08T00:34:29+00:00 | []
| [
"pt"
]
| TAGS
#task_categories-conversational #task_categories-text-generation #size_categories-1M<n<10M #language-Portuguese #license-mit #Dataset #Portuguese #Ultrachat #LLM #doi-10.57967/hf/1433 #region-us
|
# UltrachatBR: Um Dataset em Português baseado no Ultrachat
O UltrachatBR é uma versão em português do conhecido dataset Ultrachat, originalmente desenvolvido para o idioma inglês. Este projeto visa disponibilizar uma vasta coleção de diálogos traduzidos para o português, ampliando assim o acesso a recursos de processamento de linguagem natural para a comunidade de língua portuguesa.
## Processo de Tradução
O processo de tradução foi realizado utilizando a API do Google Tradutor, uma ferramenta poderosa para a tradução automática. Ao aplicar essa tecnologia, conseguimos traduzir mais de 1.4 milhão de amostras de diálogos do Ultrachat original para o português. Este passo foi crucial para garantir que os modelos de processamento de linguagem natural tenham um conjunto de dados significativo no idioma português.
## Desafios Enfrentados
Durante a tradução, identificamos algumas palavras-chave essenciais, como 'human' e 'assistant', que não foram traduzidas, pois são termos específicos do dataset Ultrachat. Para garantir a coesão e consistência, realizamos um pré-processamento adicional, substituindo essas palavras por suas equivalentes em português, resultando em "humano" e "assistente".
## Pré-processamento
Além da tradução, realizamos um cuidadoso pré-processamento dos dados para garantir a qualidade e a consistência. Isso incluiu a limpeza de dados irrelevantes, tratamento de valores nulos e a normalização de termos específicos.
## Tamanho do Dataset
O UltrachatBR abrange um impressionante conjunto de 1.453.026 amostras de diálogos em português. Essa riqueza de dados oferece uma base sólida para o treinamento e avaliação de modelos de processamento de linguagem natural voltados para o público de língua portuguesa.
## Como Citar
Se você utilizar o UltrachatBR em suas pesquisas, por favor, cite o dataset da seguinte maneira:
Esperamos que o UltrachatBR seja uma valiosa contribuição para a comunidade de processamento de linguagem natural em língua portuguesa. Fique à vontade para explorar e utilizar este recurso em seus projetos!
| [
"# UltrachatBR: Um Dataset em Português baseado no Ultrachat\n\nO UltrachatBR é uma versão em português do conhecido dataset Ultrachat, originalmente desenvolvido para o idioma inglês. Este projeto visa disponibilizar uma vasta coleção de diálogos traduzidos para o português, ampliando assim o acesso a recursos de processamento de linguagem natural para a comunidade de língua portuguesa.",
"## Processo de Tradução\n\nO processo de tradução foi realizado utilizando a API do Google Tradutor, uma ferramenta poderosa para a tradução automática. Ao aplicar essa tecnologia, conseguimos traduzir mais de 1.4 milhão de amostras de diálogos do Ultrachat original para o português. Este passo foi crucial para garantir que os modelos de processamento de linguagem natural tenham um conjunto de dados significativo no idioma português.",
"## Desafios Enfrentados\n\nDurante a tradução, identificamos algumas palavras-chave essenciais, como 'human' e 'assistant', que não foram traduzidas, pois são termos específicos do dataset Ultrachat. Para garantir a coesão e consistência, realizamos um pré-processamento adicional, substituindo essas palavras por suas equivalentes em português, resultando em \"humano\" e \"assistente\".",
"## Pré-processamento\n\nAlém da tradução, realizamos um cuidadoso pré-processamento dos dados para garantir a qualidade e a consistência. Isso incluiu a limpeza de dados irrelevantes, tratamento de valores nulos e a normalização de termos específicos.",
"## Tamanho do Dataset\n\nO UltrachatBR abrange um impressionante conjunto de 1.453.026 amostras de diálogos em português. Essa riqueza de dados oferece uma base sólida para o treinamento e avaliação de modelos de processamento de linguagem natural voltados para o público de língua portuguesa.",
"## Como Citar\n\nSe você utilizar o UltrachatBR em suas pesquisas, por favor, cite o dataset da seguinte maneira:\n\n\nEsperamos que o UltrachatBR seja uma valiosa contribuição para a comunidade de processamento de linguagem natural em língua portuguesa. Fique à vontade para explorar e utilizar este recurso em seus projetos!"
]
| [
"TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-1M<n<10M #language-Portuguese #license-mit #Dataset #Portuguese #Ultrachat #LLM #doi-10.57967/hf/1433 #region-us \n",
"# UltrachatBR: Um Dataset em Português baseado no Ultrachat\n\nO UltrachatBR é uma versão em português do conhecido dataset Ultrachat, originalmente desenvolvido para o idioma inglês. Este projeto visa disponibilizar uma vasta coleção de diálogos traduzidos para o português, ampliando assim o acesso a recursos de processamento de linguagem natural para a comunidade de língua portuguesa.",
"## Processo de Tradução\n\nO processo de tradução foi realizado utilizando a API do Google Tradutor, uma ferramenta poderosa para a tradução automática. Ao aplicar essa tecnologia, conseguimos traduzir mais de 1.4 milhão de amostras de diálogos do Ultrachat original para o português. Este passo foi crucial para garantir que os modelos de processamento de linguagem natural tenham um conjunto de dados significativo no idioma português.",
"## Desafios Enfrentados\n\nDurante a tradução, identificamos algumas palavras-chave essenciais, como 'human' e 'assistant', que não foram traduzidas, pois são termos específicos do dataset Ultrachat. Para garantir a coesão e consistência, realizamos um pré-processamento adicional, substituindo essas palavras por suas equivalentes em português, resultando em \"humano\" e \"assistente\".",
"## Pré-processamento\n\nAlém da tradução, realizamos um cuidadoso pré-processamento dos dados para garantir a qualidade e a consistência. Isso incluiu a limpeza de dados irrelevantes, tratamento de valores nulos e a normalização de termos específicos.",
"## Tamanho do Dataset\n\nO UltrachatBR abrange um impressionante conjunto de 1.453.026 amostras de diálogos em português. Essa riqueza de dados oferece uma base sólida para o treinamento e avaliação de modelos de processamento de linguagem natural voltados para o público de língua portuguesa.",
"## Como Citar\n\nSe você utilizar o UltrachatBR em suas pesquisas, por favor, cite o dataset da seguinte maneira:\n\n\nEsperamos que o UltrachatBR seja uma valiosa contribuição para a comunidade de processamento de linguagem natural em língua portuguesa. Fique à vontade para explorar e utilizar este recurso em seus projetos!"
]
|
382d6dc1cf574172fde9dbe56f709f27eb96c6da | # Dataset Card for "llama2-path-concentration-1.5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | fightfei/llama2-path-concentration-1.5k | [
"region:us"
]
| 2023-12-07T01:23:32+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 600075.1011080332, "num_examples": 1429}, {"name": "test", "num_bytes": 6298.898891966759, "num_examples": 15}], "download_size": 78423, "dataset_size": 606374.0}} | 2023-12-07T01:23:59+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "llama2-path-concentration-1.5k"
More Information needed | [
"# Dataset Card for \"llama2-path-concentration-1.5k\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"llama2-path-concentration-1.5k\"\n\nMore Information needed"
]
|
39453112de7775dc0f3decf9f1a0b3211bad66d1 | # EverythingLM
This is a version of [totally-not-an-llm/EverythingLM-data-V3](https://huggingface.co/datasets/totally-not-an-llm/EverythingLM-data-V3) formatted for instruction fine-tuning using the following
prompt template:
```
### Instruction:
Instruction
### Input:
Input
### Response:
``` | mwitiderrick/EverythingLM | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"region:us"
]
| 2023-12-07T04:33:34+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "text-generation"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4817562, "num_examples": 1073}], "download_size": 1633970, "dataset_size": 4817562}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-08T13:37:38+00:00 | []
| [
"en"
]
| TAGS
#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-mit #region-us
| # EverythingLM
This is a version of totally-not-an-llm/EverythingLM-data-V3 formatted for instruction fine-tuning using the following
prompt template:
| [
"# EverythingLM\nThis is a version of totally-not-an-llm/EverythingLM-data-V3 formatted for instruction fine-tuning using the following \nprompt template:"
]
| [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-mit #region-us \n",
"# EverythingLM\nThis is a version of totally-not-an-llm/EverythingLM-data-V3 formatted for instruction fine-tuning using the following \nprompt template:"
]
|
38b770e81377d77c3d9bba10924a668f5bc4bb3a |
# IllumiCore-1k: Llama2 Formatting
This is a VNF resource allocation dataset (1000 samples) generated by IllumiCore [1], processed to match Llama 2's prompt format [2]:
```
<s>[INST] <<SYS>>
{{ system_prompt }}
<</SYS>>
{{ user_msg_1 }} [/INST] {{ model_answer_1 }} </s><s>[INST] {{ user_msg_2 }} [/INST] {{ model_answer_2 }} </s>
```
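As an illustration, here is a minimal sketch of how one system/user/assistant exchange could be assembled into the template above; the function name and the shortened strings are hypothetical and not part of the dataset generation code.
```py
def build_llama2_prompt(system_prompt, user_msg, model_answer):
    """Assemble a single-turn record in the Llama 2 prompt format shown above."""
    return (
        f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n"
        f"{user_msg} [/INST] {model_answer} </s>"
    )

# Hypothetical, shortened example
print(build_llama2_prompt(
    "As a telecommunication realm expert ...",
    "Can you show me the current NFVI network topology diagram?",
    "The Network Functions Virtualization Infrastructure (NFVI) consists of ...",
))
```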
Here is an example of a dataset record:
```
<s>[INST] <<SYS>> As a telecommunication realm expert with professional knowledge of network function virtualization and software-defined networks, your task is to assist the user in answering questions related to network functions virtualization infrastructure. Please provide clear and detailed explanations, drawing upon your expertise in the field to address the user's inquiries effectively. Your responses should be informative and accurate, catering to the user's specific questions and providing valuable insights into network function virtualization infrastructure. Ensure that your explanations are accessible to users with varying levels of familiarity with the subject matter, and feel free to incorporate relevant examples or analogies to enhance understanding.
Here is the current NFVI topology tree:
NFVI
└── Core Switch
├── Data Center 1
│ ├── Aggregation Switch 1
│ │ ├── +---------------+
│ │ │ | Top of Rack 1 |
│ │ │ +---------------+
│ │ ├── +---------------+
│ │ │ | Top of Rack 2 |
│ │ │ +---------------+
│ │ └── +---------------+
│ │ | Top of Rack 3 |
│ │ +---------------+
│ ├── Aggregation Switch 2
│ │ ├── +---------------+
│ │ │ | Top of Rack 1 |
│ │ │ +---------------+
│ │ ├── +---------------+
│ │ │ | Top of Rack 2 |
│ │ │ +---------------+
│ │ └── +---------------+
│ │ | Top of Rack 3 |
│ │ +---------------+
│ ├── Rack 1
│ │ ├── Top of Rack 1
│ │ ├── ╔═══════════════╗
│ │ │ ║ Server 1 ║
│ │ │ ║ | ║
│ │ │ ║ CH1 ║
│ │ │ ║ CPU: 48 Cores ║
│ │ │ ║ Mem: 408 GB ║
│ │ │ ║ Net: 1 GB ║
│ │ │ ╚═══════════════╝
│ │ ├── ╔═══════════════╗
│ │ │ ║ Server 2 ║
│ │ │ ║ | ║
│ │ │ ║ CH2 ║
│ │ │ ║ CPU: 40 Cores ║
│ │ │ ║ Mem: 348 GB ║
│ │ │ ║ Net: 9 GB ║
│ │ │ ╚═══════════════╝
│ │ └── ╔═══════════════╗
│ │ ║ Server 3 ║
│ │ ║ | ║
│ │ ║ CH3 ║
│ │ ║ CPU: 50 Cores ║
│ │ ║ Mem: 392 GB ║
│ │ ║ Net: 3 GB ║
│ │ ╚═══════════════╝
│ ├── Rack 2
│ │ ├── Top of Rack 2
│ │ ├── ╔═══════════════╗
│ │ │ ║ Server 4 ║
│ │ │ ║ | ║
│ │ │ ║ CH4 ║
│ │ │ ║ CPU: 64 Cores ║
│ │ │ ║ Mem: 354 GB ║
│ │ │ ║ Net: 3 GB ║
│ │ │ ╚═══════════════╝
│ │ ├── ╔═══════════════╗
│ │ │ ║ Server 5 ║
│ │ │ ║ | ║
│ │ │ ║ CH5 ║
│ │ │ ║ CPU: 48 Cores ║
│ │ │ ║ Mem: 408 GB ║
│ │ │ ║ Net: 5 GB ║
│ │ │ ╚═══════════════╝
│ │ └── ╔═══════════════╗
│ │ ║ Server 6 ║
│ │ ║ | ║
│ │ ║ CH6 ║
│ │ ║ CPU: 54 Cores ║
│ │ ║ Mem: 206 GB ║
│ │ ║ Net: 1 GB ║
│ │ ╚═══════════════╝
│ └── Rack 3
│ ├── Top of Rack 3
│ ├── ╔═══════════════╗
│ │ ║ Server 7 ║
│ │ ║ | ║
│ │ ║ CH7 ║
│ │ ║ CPU: 64 Cores ║
│ │ ║ Mem: 412 GB ║
│ │ ║ Net: 9 GB ║
│ │ ╚═══════════════╝
│ ├── ╔═══════════════╗
│ │ ║ Server 8 ║
│ │ ║ | ║
│ │ ║ CH8 ║
│ │ ║ CPU: 40 Cores ║
│ │ ║ Mem: 314 GB ║
│ │ ║ Net: 5 GB ║
│ │ ╚═══════════════╝
│ └── ╔═══════════════╗
│ ║ Server 9 ║
│ ║ | ║
│ ║ CH9 ║
│ ║ CPU: 60 Cores ║
│ ║ Mem: 402 GB ║
│ ║ Net: 3 GB ║
│ ╚═══════════════╝
└── Data Center 2
├── Aggregation Switch 3
│ ├── +---------------+
│ │ | Top of Rack 4 |
│ │ +---------------+
│ ├── +---------------+
│ │ | Top of Rack 5 |
│ │ +---------------+
│ └── +---------------+
│ | Top of Rack 6 |
│ +---------------+
├── Aggregation Switch 4
│ ├── +---------------+
│ │ | Top of Rack 4 |
│ │ +---------------+
│ ├── +---------------+
│ │ | Top of Rack 5 |
│ │ +---------------+
│ └── +---------------+
│ | Top of Rack 6 |
│ +---------------+
├── Rack 4
│ ├── Top of Rack 4
│ ├── ╔═══════════════╗
│ │ ║ Server 10 ║
│ │ ║ | ║
│ │ ║ CH10 ║
│ │ ║ CPU: 58 Cores ║
│ │ ║ Mem: 350 GB ║
│ │ ║ Net: 7 GB ║
│ │ ╚═══════════════╝
│ ├── ╔═══════════════╗
│ │ ║ Server 11 ║
│ │ ║ | ║
│ │ ║ CH11 ║
│ │ ║ CPU: 46 Cores ║
│ │ ║ Mem: 198 GB ║
│ │ ║ Net: 5 GB ║
│ │ ╚═══════════════╝
│ └── ╔═══════════════╗
│ ║ Server 12 ║
│ ║ | ║
│ ║ CH12 ║
│ ║ CPU: 58 Cores ║
│ ║ Mem: 234 GB ║
│ ║ Net: 1 GB ║
│ ╚═══════════════╝
├── Rack 5
│ ├── Top of Rack 5
│ ├── ╔═══════════════╗
│ │ ║ Server 13 ║
│ │ ║ | ║
│ │ ║ CH13 ║
│ │ ║ CPU: 42 Cores ║
│ │ ║ Mem: 348 GB ║
│ │ ║ Net: 7 GB ║
│ │ ╚═══════════════╝
│ ├── ╔═══════════════╗
│ │ ║ Server 14 ║
│ │ ║ | ║
│ │ ║ CH14 ║
│ │ ║ CPU: 56 Cores ║
│ │ ║ Mem: 180 GB ║
│ │ ║ Net: 3 GB ║
│ │ ╚═══════════════╝
│ └── ╔═══════════════╗
│ ║ Server 15 ║
│ ║ | ║
│ ║ CH15 ║
│ ║ CPU: 44 Cores ║
│ ║ Mem: 310 GB ║
│ ║ Net: 1 GB ║
│ ╚═══════════════╝
└── Rack 6
├── Top of Rack 6
├── ╔═══════════════╗
│ ║ Server 16 ║
│ ║ | ║
│ ║ CH16 ║
│ ║ CPU: 50 Cores ║
│ ║ Mem: 254 GB ║
│ ║ Net: 1 GB ║
│ ╚═══════════════╝
├── ╔═══════════════╗
│ ║ Server 17 ║
│ ║ | ║
│ ║ CH17 ║
│ ║ CPU: 34 Cores ║
│ ║ Mem: 478 GB ║
│ ║ Net: 7 GB ║
│ ╚═══════════════╝
└── ╔═══════════════╗
║ Server 18 ║
║ | ║
║ CH18 ║
║ CPU: 46 Cores ║
║ Mem: 454 GB ║
║ Net: 3 GB ║
╚═══════════════╝
<</SYS>> Can you show me the current NFVI network topology diagram? [/INST] The Network Functions Virtualization Infrastructure (NFVI) consists of a core switch that connects two data centers. Data Center 1 comprises three racks, each containing servers with varying configurations. Each rack is connected to the network via two aggregation switches. The first rack houses three servers with different core counts, memory capacities, and network speeds. The same configuration exists for the second and third racks. Similarly, Data Center 2 is designed with the same architecture, hosting three more racks, each connected via two additional aggregation switches and containing three servers each. The servers across all racks display a diverse range of core counts, memory capacities, and network speeds, providing a robust and versatile NFVI. </s><s>[INST] Can I see a sample of the current VNF resource allocation? [/INST] The resource allocation results indicate that the Network Functions Virtualization Infrastructure Management (VIM) has allocated servers to top of rack (TOR) switches optimally. The servers 1 to 6, 7 to 12, and 13 to 18 are allocated to VIM 1, 2, and 3 respectively. Most servers remain unoccupied, except for servers 13 and 14. Server 13, with a total capacity of 42 CPU cores, 348 GB memory, and 7 GB network, has four virtual machines (VMs) assigned to it, consuming 36 cores, 99 GB memory, and 4 GB network. The remaining capacity is 6 cores, 249 GB memory, and 3 GB network. Server 14, with a total capacity of 56 cores, 180 GB memory, and 3 GB network, has two VMs assigned, consuming 25 cores, 71 GB memory, and 2 GB network. The remaining capacity is 31 cores, 109 GB memory, and 1 GB network. The allocation has achieved an optimal status, demonstrating efficient resource distribution. </s>
```
---
[1] L. Popokh, J. Su, S. Nair and E. Olinick, "IllumiCore: Optimization Modeling and Implementation for Efficient VNF Placement," 2021 International Conference on Software, Telecommunications and Computer Networks (SoftCOM), Split, Hvar, Croatia, 2021, pp. 1-7, doi: 10.23919/SoftCOM52868.2021.9559076.
[2] [How to Prompt Llama 2](https://huggingface.co/blog/llama2#how-to-prompt-llama-2).
| arm/illumicore-llama2-1k | [
"task_categories:text-generation",
"task_categories:summarization",
"size_categories:1K<n<10K",
"language:en",
"license:llama2",
"region:us"
]
| 2023-12-07T04:38:29+00:00 | {"language": ["en"], "license": "llama2", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation", "summarization"], "pretty_name": "IllumiCore-Llama2-1k", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12515626, "num_examples": 1000}], "download_size": 2510028, "dataset_size": 12515626}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-14T03:51:21+00:00 | []
| [
"en"
]
| TAGS
#task_categories-text-generation #task_categories-summarization #size_categories-1K<n<10K #language-English #license-llama2 #region-us
|
# IllumiCore-1k: Llama2 Formatting
This is a VNF resource allocation dataset (1000 samples) generated by IllumiCore [1], processed to match Llama 2's prompt format [2]:
Here is an example of a dataset record:
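A minimal loading sketch for printing one such record; the repository id, the `train` split, and the single `text` column are taken from this record's dataset metadata, and anything beyond that should be treated as an assumption rather than part of the original card.

```python
# Minimal sketch: load the dataset and print one Llama-2 formatted record.
# Repo id, split name, and the "text" column come from the card's metadata.
from datasets import load_dataset

ds = load_dataset("arm/illumicore-llama2-1k", split="train")
print(ds[0]["text"])  # e.g. "<s>[INST] <<SYS>> ... <</SYS>> ... [/INST] ... </s>"
```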
---
[1] L. Popokh, J. Su, S. Nair and E. Olinick, "IllumiCore: Optimization Modeling and Implementation for Efficient VNF Placement," 2021 International Conference on Software, Telecommunications and Computer Networks (SoftCOM), Split, Hvar, Croatia, 2021, pp. 1-7, doi: 10.23919/SoftCOM52868.2021.9559076.
[2] How to Prompt Llama 2.
| [
"# IllumiCore-1k: Llama2 Formatting\n\nThis is a VNF resource allocation dataset (1000 samples) generated by IllumiCore [1], processed to match Llama 2's prompt format [2]:\n\n\n\nHere is an example of a dataset record:\n\n\n\n---\n\n[1] L. Popokh, J. Su, S. Nair and E. Olinick, \"IllumiCore: Optimization Modeling and Implementation for Efficient VNF Placement,\" 2021 International Conference on Software, Telecommunications and Computer Networks (SoftCOM), Split, Hvar, Croatia, 2021, pp. 1-7, doi: 10.23919/SoftCOM52868.2021.9559076.\n\n[2] How to Prompt Llama 2."
]
| [
"TAGS\n#task_categories-text-generation #task_categories-summarization #size_categories-1K<n<10K #language-English #license-llama2 #region-us \n",
"# IllumiCore-1k: Llama2 Formatting\n\nThis is a VNF resource allocation dataset (1000 samples) generated by IllumiCore [1], processed to match Llama 2's prompt format [2]:\n\n\n\nHere is an example of a dataset record:\n\n\n\n---\n\n[1] L. Popokh, J. Su, S. Nair and E. Olinick, \"IllumiCore: Optimization Modeling and Implementation for Efficient VNF Placement,\" 2021 International Conference on Software, Telecommunications and Computer Networks (SoftCOM), Split, Hvar, Croatia, 2021, pp. 1-7, doi: 10.23919/SoftCOM52868.2021.9559076.\n\n[2] How to Prompt Llama 2."
]
|
270580ce849ca740df81a155cd307cac87afa23c |
# Dataset of Suzukaze Aoba
This is the dataset of Suzukaze Aoba, containing 972 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 972 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 2261 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 2551 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 972 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 972 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 972 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 2261 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 2261 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 1949 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 2551 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 2551 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
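As a minimal usage sketch (assuming the archives are stored at the repository root, as the table's links suggest), one of the packaged files can be fetched with `huggingface_hub`:

```python
# Minimal sketch: download one of the packaged archives from the table above.
# The repo id and filename follow this card; adjust "filename" to the package you need.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="CyberHarem/suzukaze_aoba_newgame",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(path)  # local path of the cached archive
```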
| CyberHarem/suzukaze_aoba_newgame | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
]
| 2023-12-07T04:52:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-07T05:40:16+00:00 | []
| []
| TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Suzukaze Aoba
========================
This is the dataset of Suzukaze Aoba, containing 972 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
| []
| [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
]
|
18607faa1a9698aa8d4b7683937fb94a0c2c7060 |
# dolphin-coder

This dataset is transformed from https://www.kaggle.com/datasets/erichartford/leetcode-rosetta
it is used to train dolphin-coder model | cognitivecomputations/dolphin-coder | [
"language:en",
"license:apache-2.0",
"region:us"
]
| 2023-12-07T05:04:39+00:00 | {"language": ["en"], "license": "apache-2.0"} | 2023-12-07T06:46:14+00:00 | []
| [
"en"
]
| TAGS
#language-English #license-apache-2.0 #region-us
|
# dolphin-coder
!image/png
This dataset is transformed from URL
it is used to train dolphin-coder model | [
"# dolphin-coder \n\n\n!image/png\n\nThis dataset is transformed from URL\nit is used to train dolphin-coder model"
]
| [
"TAGS\n#language-English #license-apache-2.0 #region-us \n",
"# dolphin-coder \n\n\n!image/png\n\nThis dataset is transformed from URL\nit is used to train dolphin-coder model"
]
|
cac6ff9a2c27f82d4247687b717685cd27338083 | # Dataset Card for "xl-gen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ShuoShuoShuo/xl-gen | [
"region:us"
]
| 2023-12-07T05:57:06+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11288495272.3, "num_examples": 96150}], "download_size": 8284700494, "dataset_size": 11288495272.3}} | 2023-12-07T10:36:30+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "xl-gen"
More Information needed | [
"# Dataset Card for \"xl-gen\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"xl-gen\"\n\nMore Information needed"
]
|
b3e5a7b142f1282cecb6f7aef3310a9b4c178b9e |
## Dataset Description
- **Homepage:** https://www.wikiart.org/
### Dataset Summary
Dataset containing 81,444 pieces of visual art from various artists, taken from WikiArt.org,
along with class labels for each image :
* "artist" : 129 artist classes, including a "Unknown Artist" class
* "genre" : 11 genre classes, including a "Unknown Genre" class
* "style" : 27 style classes
On WikiArt.org, the description for the "Artworks by Genre" page reads :
A genre system divides artworks according to depicted themes and objects. A classical hierarchy of genres was developed in European culture by the 17th century. It ranked genres in high – history painting and portrait, - and low – genre painting, landscape and still life. This hierarchy was based on the notion of man as the measure of all things. Landscape and still life were the lowest because they did not involve human subject matter. History was highest because it dealt with the noblest events of humanity. Genre system is not so much relevant for a contemporary art; there are just two genre definitions that are usually applied to it: abstract or figurative.
The "Artworks by Style" page reads :
A style of an artwork refers to its distinctive visual elements, techniques and methods. It usually corresponds with an art movement or a school (group) that its author is associated with.
## Dataset Structure
* "image" : image
* "artist" : 129 artist classes, including a "Unknown Artist" class
* "genre" : 11 genre classes, including a "Unknown Genre" class
* "style" : 27 style classes
### Source Data
Files taken from this [archive](https://archive.org/download/wikiart-dataset/wikiart.tar.gz), curated from the [WikiArt website](https://www.wikiart.org/).
## Additional Information
Note:
* The WikiArt dataset can be used only for non-commercial research purpose.
* The images in the WikiArt dataset were obtained from WikiArt.org.
* The authors are neither responsible for the content nor the meaning of these images.
By using the WikiArt dataset, you agree to obey the terms and conditions of WikiArt.org.
### Contributions
[`gigant`](https://huggingface.co/gigant) added this dataset to the hub. | williamberman/wikiart | [
"task_categories:image-classification",
"task_categories:text-to-image",
"task_categories:image-to-text",
"size_categories:10K<n<100K",
"license:unknown",
"art",
"region:us"
]
| 2023-12-07T06:01:30+00:00 | {"license": "unknown", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification", "text-to-image", "image-to-text"], "license_details": "Data files \u00a9 Original Authors", "tags": ["art"]} | 2023-12-07T06:14:03+00:00 | []
| []
| TAGS
#task_categories-image-classification #task_categories-text-to-image #task_categories-image-to-text #size_categories-10K<n<100K #license-unknown #art #region-us
|
## Dataset Description
- Homepage: URL
### Dataset Summary
Dataset containing 81,444 pieces of visual art from various artists, taken from URL,
along with class labels for each image :
* "artist" : 129 artist classes, including a "Unknown Artist" class
* "genre" : 11 genre classes, including a "Unknown Genre" class
* "style" : 27 style classes
On URL, the description for the "Artworks by Genre" page reads :
A genre system divides artworks according to depicted themes and objects. A classical hierarchy of genres was developed in European culture by the 17th century. It ranked genres in high – history painting and portrait, - and low – genre painting, landscape and still life. This hierarchy was based on the notion of man as the measure of all things. Landscape and still life were the lowest because they did not involve human subject matter. History was highest because it dealt with the noblest events of humanity. Genre system is not so much relevant for a contemporary art; there are just two genre definitions that are usually applied to it: abstract or figurative.
The "Artworks by Style" page reads :
A style of an artwork refers to its distinctive visual elements, techniques and methods. It usually corresponds with an art movement or a school (group) that its author is associated with.
## Dataset Structure
* "image" : image
* "artist" : 129 artist classes, including a "Unknown Artist" class
* "genre" : 11 genre classes, including a "Unknown Genre" class
* "style" : 27 style classes
### Source Data
Files taken from this archive, curated from the WikiArt website.
## Additional Information
Note:
* The WikiArt dataset can be used only for non-commercial research purpose.
* The images in the WikiArt dataset were obtained from URL.
* The authors are neither responsible for the content nor the meaning of these images.
By using the WikiArt dataset, you agree to obey the terms and conditions of URL.
### Contributions
'gigant' added this dataset to the hub. | [
"## Dataset Description\n\n- Homepage: URL",
"### Dataset Summary\n\nDataset containing 81,444 pieces of visual art from various artists, taken from URL,\nalong with class labels for each image :\n\n* \"artist\" : 129 artist classes, including a \"Unknown Artist\" class\n* \"genre\" : 11 genre classes, including a \"Unknown Genre\" class\n* \"style\" : 27 style classes\n\nOn URL, the description for the \"Artworks by Genre\" page reads :\n\nA genre system divides artworks according to depicted themes and objects. A classical hierarchy of genres was developed in European culture by the 17th century. It ranked genres in high – history painting and portrait, - and low – genre painting, landscape and still life. This hierarchy was based on the notion of man as the measure of all things. Landscape and still life were the lowest because they did not involve human subject matter. History was highest because it dealt with the noblest events of humanity. Genre system is not so much relevant for a contemporary art; there are just two genre definitions that are usually applied to it: abstract or figurative.\n\nThe \"Artworks by Style\" page reads :\n\nA style of an artwork refers to its distinctive visual elements, techniques and methods. It usually corresponds with an art movement or a school (group) that its author is associated with.",
"## Dataset Structure\n\n* \"image\" : image\n* \"artist\" : 129 artist classes, including a \"Unknown Artist\" class\n* \"genre\" : 11 genre classes, including a \"Unknown Genre\" class\n* \"style\" : 27 style classes",
"### Source Data\n\nFiles taken from this archive, curated from the WikiArt website.",
"## Additional Information\n\nNote:\n\n* The WikiArt dataset can be used only for non-commercial research purpose.\n* The images in the WikiArt dataset were obtained from URL.\n* The authors are neither responsible for the content nor the meaning of these images.\n\nBy using the WikiArt dataset, you agree to obey the terms and conditions of URL.",
"### Contributions\n\n'gigant' added this dataset to the hub."
]
| [
"TAGS\n#task_categories-image-classification #task_categories-text-to-image #task_categories-image-to-text #size_categories-10K<n<100K #license-unknown #art #region-us \n",
"## Dataset Description\n\n- Homepage: URL",
"### Dataset Summary\n\nDataset containing 81,444 pieces of visual art from various artists, taken from URL,\nalong with class labels for each image :\n\n* \"artist\" : 129 artist classes, including a \"Unknown Artist\" class\n* \"genre\" : 11 genre classes, including a \"Unknown Genre\" class\n* \"style\" : 27 style classes\n\nOn URL, the description for the \"Artworks by Genre\" page reads :\n\nA genre system divides artworks according to depicted themes and objects. A classical hierarchy of genres was developed in European culture by the 17th century. It ranked genres in high – history painting and portrait, - and low – genre painting, landscape and still life. This hierarchy was based on the notion of man as the measure of all things. Landscape and still life were the lowest because they did not involve human subject matter. History was highest because it dealt with the noblest events of humanity. Genre system is not so much relevant for a contemporary art; there are just two genre definitions that are usually applied to it: abstract or figurative.\n\nThe \"Artworks by Style\" page reads :\n\nA style of an artwork refers to its distinctive visual elements, techniques and methods. It usually corresponds with an art movement or a school (group) that its author is associated with.",
"## Dataset Structure\n\n* \"image\" : image\n* \"artist\" : 129 artist classes, including a \"Unknown Artist\" class\n* \"genre\" : 11 genre classes, including a \"Unknown Genre\" class\n* \"style\" : 27 style classes",
"### Source Data\n\nFiles taken from this archive, curated from the WikiArt website.",
"## Additional Information\n\nNote:\n\n* The WikiArt dataset can be used only for non-commercial research purpose.\n* The images in the WikiArt dataset were obtained from URL.\n* The authors are neither responsible for the content nor the meaning of these images.\n\nBy using the WikiArt dataset, you agree to obey the terms and conditions of URL.",
"### Contributions\n\n'gigant' added this dataset to the hub."
]
|
b9c7b762fa65a8d94dddf7e9ca052ac5fd765fe7 | License/Credit goes to the original creators of the dataset.
This is a sample dataset used for instruction fine tuning of LLMs | dolo650/lamini_docs_processed | [
"license:apache-2.0",
"region:us"
]
| 2023-12-07T06:06:22+00:00 | {"license": "apache-2.0"} | 2023-12-07T06:13:00+00:00 | []
| []
| TAGS
#license-apache-2.0 #region-us
| License/Credit goes to the original creators of the dataset.
This is a sample dataset used for instruction fine tuning of LLMs | []
| [
"TAGS\n#license-apache-2.0 #region-us \n"
]
|
e8132675138255269f2ed0946acc2876bdf828a2 | # ScreenEval
contact: Barrett Lattimer, [email protected] \
paper: [Fast and Accurate Factual Inconsistency Detection Over Long Documents](https://arxiv.org/abs/2310.13189) \
github: [scale-score](https://github.com/asappresearch/scale-score)
ScreenEval is a novel dataset designed for factual inconsistency detection in long dialogues.
52 TV transcripts were summarized by humans, Longformer, and GPT-4, then each summary sentence was labelled for factual consistency with the source TV transcript.
Additionally, if a summary sentence was factually consistent, labellers provided relevant utterance support in the source document.
ScreenEval is the longest dialogue-based dataset by tokens for factual inconsistency detection available to date.
The dialogue domain presents unique challenges such as long-distance coreference resolution and significant noise between relevant utterances.
ScreenEval is the dataset proposed in the paper "Fast and Accurate Factual Inconsistency Detection Over Long Documents" from EMNLP2023.
## Stats at a glance
- 52 TV transcripts
- \>6k tokens per TV transcript
- 624 summary sentences in total (from humans, Longformer, and GPT-4)
- Relevant utterance labels for all factually consistent summary sentences
## Arguments
The following keys can be used to access the different parts of the ScreenEval dataset.
| Key | Type | Description |
| ------ | ------ | ------ |
| original_convo | List[str] | The source document that is to be summarized as a string |
| convo | List[List[str]] | The source document that is to be summarized split into a list of utterances |
| inferred_summary | List[str] | The summary sentence that is paired with the given source document |
| summary_id | List[str] | The source model for the summary sentence |
| convo_id | List[int] | The ID of the source document |
| annotated_summary | List[str] | The entire associated summary, with the focus summary sentence surrounded by `<mark></mark>` |
| prediction_annotated_source_doc | List[str] | Raw source document |
| agreement | List[float] | Annotator agreement on summary sentence factual inconsistency label |
| agg_label | List[bool] | Factual inconsistency label (true -> factually consistent, false -> factually inconsistent) |
| rel_utt | List[List[int]] | The indices of related utterances in the corresponding `convo` list. | blattimer/ScreenEval | [
"license:mit",
"arxiv:2310.13189",
"region:us"
]
| 2023-12-07T06:33:52+00:00 | {"license": "mit", "viewer": false} | 2023-12-07T07:14:55+00:00 | [
"2310.13189"
]
| []
| TAGS
#license-mit #arxiv-2310.13189 #region-us
| ScreenEval
==========
contact: Barrett Lattimer, blattimer@URL
paper: Fast and Accurate Factual Inconsistency Detection Over Long Documents
github: scale-score
ScreenEval is a novel dataset designed for factual inconsistency detection in long dialogues.
52 TV transcripts were summarized by humans, Longformer, and GPT-4, then each summary sentence was labelled for factual consistency with the source TV transcript.
Additionally, if a summary sentence was factually consistent, labellers provided relevant utterance support in the source document.
ScreenEval is the longest dialogue-based dataset by tokens for factual inconsistency detection available to date.
The dialogue domain presents unique challenges such as long-distance coreference resolution and significant noise between relevant utterances.
ScreenEval is the dataset proposed in the paper "Fast and Accurate Factual Inconsistency Detection Over Long Documents" from EMNLP2023.
Stats at a glance
-----------------
* 52 TV transcripts
* >6k tokens per TV transcript
* 624 summary sentences in total (from humans, Longformer, and GPT-4)
* Relevant utterance labels for all factually consistent summary sentences
Arguments
---------
The following keys can be used to access the different parts of the ScreenEval dataset.
Key: original\_convo, Type: List[str], Description: The source document that is to be summarized as a string
Key: convo, Type: List[List[str]], Description: The source document that is to be summarized split into a list of utterances
Key: inferred\_summary, Type: List[str], Description: The summary sentence that is paired with the given source document
Key: summary\_id, Type: List[str], Description: The source model for the summary sentence
Key: convo\_id, Type: List[int], Description: The ID of the source document
Key: annotated\_summary, Type: List[str], Description: The entire associated summary, with the focus summary sentence surrounded by '<mark></mark>'
Key: prediction\_annotated\_source\_doc, Type: List[str], Description: Raw source document
Key: agreement, Type: List[float], Description: Annotator agreement on summary sentence factual inconsistency label
Key: agg\_label, Type: List[bool], Description: Factual inconsistency label (true -> factually consistent, false -> factually inconsistent)
Key: rel\_utt, Type: List[List[int]], Description: The indices of related utterances in the corresponding 'convo' list.
| []
| [
"TAGS\n#license-mit #arxiv-2310.13189 #region-us \n"
]
|
9aaf12381979220bea352b8291be66bb826c57bf | # Dataset Card for "tweet-test-trial"
sample dataset of length 60000 | Dhairya/tweet-test-trial | [
"region:us"
]
| 2023-12-07T06:40:51+00:00 | {"dataset_info": {"features": [{"name": "date", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "username", "dtype": "string"}, {"name": "media", "dtype": "string"}, {"name": "inferred company", "dtype": "string"}, {"name": "bytes", "dtype": "image"}, {"name": "likes", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 41703660, "num_examples": 60000, "dataset_name": "tweet-test-trial"}]}} | 2023-12-07T06:56:24+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "tweet-test-trial"
sample dataset of length 60000 | [
"# Dataset Card for \"tweet-test-trial\"\n\nsample dataset of length 60000"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"tweet-test-trial\"\n\nsample dataset of length 60000"
]
|
85133c8f904252315dd9a4333077fa940965234a |
# Dataset of Yagami Kou
This is the dataset of Yagami Kou, containing 460 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 460 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 1090 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 1287 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 460 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 460 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 460 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 1090 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 1090 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 975 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 1287 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 1287 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| CyberHarem/yagami_kou_newgame | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
]
| 2023-12-07T06:43:33+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-07T07:07:09+00:00 | []
| []
| TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Yagami Kou
=====================
This is the dataset of Yagami Kou, containing 460 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
| []
| [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
]
|
de6d8b6ff9b8837d272113536fec1954d0cbdbaf |
# Dataset Card for Background Summarization of Event Timelines
This dataset provides background text summaries for news events timelines.
## Dataset Details
### Dataset Description
Generating concise summaries of news events is a challenging natural language processing task. While journalists often curate timelines to highlight key sub-events, newcomers to a news event face challenges in catching up on its historical context. This dataset addresses this need by introducing the task of background news summarization, which complements each timeline update with a background summary of relevant preceding events. This dataset includes human-annotated backgrounds for 14 major news events from 2005--2014.
- **Curated by:** Adithya Pratapa, Kevin Small, Markus Dreyer
- **Language(s) (NLP):** English
- **License:** CC-BY-NC-4.0
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/amazon-science/background-summaries
- **Paper:** https://arxiv.org/abs/2310.16197
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset can be used for training text summarization systems. The trained systems would be capable of generating background (historical context) to a news update. To generate the background, the system takes past news updates as input.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
Systems trained on this dataset might not perform as expected on domains other than newswire. To avoid factual errors, system-generated summaries should be verified by experts before being deployed in the real world.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
### Dataset Fields
| Field | Name | Description |
| :--- | :--- | :--- |
| src | Source | Concatenated string of all the previous updates. Each update text includes the publication date. |
| z | Guidance | Update text for the current timestep. |
| tgt | Target | Background text for the current timestep. |
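To make the field semantics concrete, the sketch below assembles one (src, z, tgt) triple; the update texts and dates are invented placeholders, and the exact concatenation format of `src` is an assumption. Only the field semantics come from the table above.

```python
# Illustrative sketch of the three fields; the events, dates, and formatting below
# are invented placeholders, not records from the dataset.
past_updates = [
    ("2011-02-15", "Protests break out in Benghazi."),
    ("2011-02-26", "The UN Security Council imposes sanctions on the Libyan leadership."),
]

src = " ".join(f"({date}) {text}" for date, text in past_updates)  # all previous updates, with dates
z = "(2011-03-19) Coalition forces begin air strikes."             # current update (guidance)
tgt = "Anti-government protests that began in Benghazi escalated into an armed conflict, prompting UN sanctions."  # background
```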
### Data Splits
An overview of the major events and their splits in this dataset. The last column provides the statistics for background annotations provided in this dataset.
| Split | Major event | Sources (# timelines) | Time period | # updates | len(updates) | len(background) |
| :--- | :--- | ---: | ---: | ---: | ---: | ---: |
| Train | Swine flu | T17 (3) | 2009 | 21 | 52 | 45 |
| Train | Financial crisis | T17 (1) | 2008 | 65 | 115 | 147 |
| Train | Iraq war | T17 (1) | 2005 | 155 | 41 | 162 |
| Validation | Haitian earthquake | T17 (1) | 2010 | 11 | 100 | 61 |
| Validation | Michael Jackson death | T17 (1) | 2009--2011 | 37 | 36 | 164 |
| Validation | BP oil spill | T17 (5) | 2010--2012 | 118 | 56 | 219 |
| Test | NSA leak | SocialTimeline (1) | 2014 | 29 | 45 | 50 |
| Test | Gaza conflict | SocialTimeline (1) | 2014 | 38 | 183 | 263 |
| Test | MH370 flight disappearance | SocialTimeline (1) | 2014 | 39 | 39 | 127 |
| Test | Yemen crisis | Crisis (6) | 2011--2012 | 81 | 30 | 125 |
| Test | Russian-Ukraine conflict | SocialTimeline (3) | 2014 | 86 | 112 | 236 |
| Test | Libyan crisis | T17 (2); Crisis (7) | 2011 | 118 | 38 | 177 |
| Test | Egyptian crisis | T17 (1); Crisis (4) | 2011--2013 | 129 | 34 | 187 |
| Test | Syrian crisis | T17 (4); Crisis (5) | 2011--2013 | 164 | 30 | 162 |
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
Readers often find it difficult to keep track of complex news events. A background summary that provides sufficient historical context can help improve the reader's understanding of a news update. This dataset provides human-annotated backgrounds for development and evaluation of background summarization systems.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
This dataset is built upon three popular news timeline summarization datasets, Timeline17 ([Binh Tran et al., 2013](https://dl.acm.org/doi/10.1145/2487788.2487829)), Crisis ([Tran et al., 2015](https://link.springer.com/chapter/10.1007/978-3-319-16354-3_26)), and Social Timeline ([Wang et al., 2015](https://aclanthology.org/N15-1112/)).
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
__Timeline17:__ compiled from an ensemble of news websites, this dataset provides 17 timelines spanning 9 major events from 2005--2013.
__Crisis:__ a follow-up to the Timeline17 dataset, this covers 25 timelines spanning 4 major events. While it mostly covers a subset of events from Timeline17, it adds a new event (the Yemen crisis).
__Social Timeline:__ compiled 6 timelines covering 4 major events from 2014. The timelines were collected from Wikipedia, NYTimes, and BBC.
### Annotations
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
Timelines were originally collected from various news websites (CNN, BBC, NYTimes, etc.), and many events have more than one timeline. Since each timeline covers the same underlying event, we merge them using timestamps to create a single timeline per event. During this merging process, we often end up with more than one update text per timestamp with possibly duplicate content. We ask the annotators to first rewrite the input updates to remove any duplicate content. Our annotation process for each news event contains the following three steps:
1. Read the input timeline to get a high-level understanding of the event.
2. For each timestep, read the provided 'rough' update summary. Rewrite the update into a short paragraph, removing any duplicate or previously reported subevents.
3. Go through the timeline in a sequential manner and write a background summary for each timestep.
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
We hired three professional annotators. For each timeline, we collect three independent (rewritten) update and (new) background pairs.
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
To the best of our knowledge, there is no personal or sensitive information in this dataset.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
### Limitations
__Personalized Backgrounds:__ While a background summary can be useful to any news reader, the utility can vary depending on the reader's familiarity with the event. This dataset doesn't include any backgrounds customized to individual readers.
__Local Events:__ This dataset is limited to globally popular events involving disasters and conflicts. We leave the task of collecting background summaries for local events to future work.
__Background from News Articles:__ Background summaries can also be generated directly from news articles. In this dataset, we only consider background summaries based on past news updates. We leave the extension to news articles to future work.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
__BibTeX:__
```bibtex
@article{pratapa-etal-2023-background,
title = {Background Summarization of Event Timelines},
author = {Pratapa, Adithya and Small, Kevin and Dreyer, Markus},
publisher = {EMNLP},
year = {2023},
url = {https://arxiv.org/abs/2310.16197},
}
```
## Glossary
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
__Major event:__ the key news story for which we are constructing a timeline. For instance, 'Egyptian Crisis', 'BP oil spill', 'MH 370 disappearance' are some of the major events from our dataset.
__Timeline:__ a series of timesteps. Each timestep in a timeline is associated with an update and a background summary.
__Timestep:__ day of the event (`yyyy-mm-dd`).
__Update:__ a short text summary of _what's new_ in the news story. This text summarizes the latest events, specifically ones that are important to the overall story.
__Background:__ a short text summary that provides _sufficient historical context_ for the current update. Background aims to provide the reader a quick history of the news story, without them having to read all the previous updates. Background should cover past events that help in understanding the current events described in the update.
## Dataset Card Authors
Adithya Pratapa, Kevin Small, Markus Dreyer
## Dataset Card Contact
[Adithya Pratapa](https://apratapa.xyz)
| adithya7/background-summaries | [
"task_categories:summarization",
"annotations_creators:expert-generated",
"language_creators:expert-generated",
"size_categories:1K<n<10K",
"source_datasets:Timeline17",
"source_datasets:Crisis",
"source_datasets:SocialTimeline",
"language:en",
"license:cc-by-nc-4.0",
"summarization",
"event-summarization",
"background-summarization",
"arxiv:2310.16197",
"region:us"
]
| 2023-12-07T07:05:24+00:00 | {"annotations_creators": ["expert-generated"], "language_creators": ["expert-generated"], "language": ["en"], "license": "cc-by-nc-4.0", "size_categories": ["1K<n<10K"], "source_datasets": ["Timeline17", "Crisis", "SocialTimeline"], "task_categories": ["summarization"], "pretty_name": "Background Summarization", "tags": ["summarization", "event-summarization", "background-summarization"]} | 2023-12-07T07:08:51+00:00 | [
"2310.16197"
]
| [
"en"
]
| TAGS
#task_categories-summarization #annotations_creators-expert-generated #language_creators-expert-generated #size_categories-1K<n<10K #source_datasets-Timeline17 #source_datasets-Crisis #source_datasets-SocialTimeline #language-English #license-cc-by-nc-4.0 #summarization #event-summarization #background-summarization #arxiv-2310.16197 #region-us
| Dataset Card for Background Summarization of Event Timelines
============================================================
This dataset provides background text summaries for news events timelines.
Dataset Details
---------------
### Dataset Description
Generating concise summaries of news events is a challenging natural language processing task. While journalists often curate timelines to highlight key sub-events, newcomers to a news event face challenges in catching up on its historical context. This dataset addresses this need by introducing the task of background news summarization, which complements each timeline update with a background summary of relevant preceding events. This dataset includes human-annotated backgrounds for 14 major news events from 2005--2014.
* Curated by: Adithya Pratapa, Kevin Small, Markus Dreyer
* Language(s) (NLP): English
* License: CC-BY-NC-4.0
### Dataset Sources
* Repository: URL
* Paper: URL
Uses
----
### Direct Use
This dataset can be used for training text summarization systems. The trained systems would be capable of generating background (historical context) to a news update. To generate the background, the system takes past news updates as input.
### Out-of-Scope Use
Systems trained on this dataset might not perform as expected on domains other than newswire. To avoid factual errors, system-generated summaries should be verified by experts before being deployed in the real world.
Dataset Structure
-----------------
### Dataset Fields
### Data Splits
An overview of the major events and their splits in this dataset. The last column provides the statistics for background annotations provided in this dataset.
Dataset Creation
----------------
### Curation Rationale
Readers often find it difficult to keep track of complex news events. A background summary that provides sufficient historical context can help improve the reader's understanding of a news update. This dataset provides human-annotated backgrounds for development and evaluation of background summarization systems.
### Source Data
#### Data Collection and Processing
This dataset is built upon three popular news timeline summarization datasets, Timeline17 (Binh Tran et al., 2013), Crisis (Tran et al., 2015), and Social Timeline (Wang et al., 2015).
#### Who are the source data producers?
**Timeline17:** compiled from an ensemble of news websites, this dataset provides 17 timelines spanning 9 major events from 2005--2013.
**Crisis:** a follow-up to the Timeline17 dataset, this covers 25 timelines spanning 4 major events. While it mostly covers a subset of events from Timeline17, it adds a new event (the Yemen crisis).
**Social Timeline:** compiled 6 timelines covering 4 major events from 2014. The timelines were collected from Wikipedia, NYTimes, and BBC.
### Annotations
#### Annotation process
Timelines were originally collected from various news websites (CNN, BBC, NYTimes, etc.), and many events have more than one timeline. Since each timeline covers the same underlying event, we merge them using timestamps to create a single timeline per event. During this merging process, we often end up with more than one update text per timestamp with possibly duplicate content. We ask the annotators to first rewrite the input updates to remove any duplicate content. Our annotation process for each news event contains the following three steps:
1. Read the input timeline to get a high-level understanding of the event.
2. For each timestep, read the provided 'rough' update summary. Rewrite the update into a short paragraph, removing any duplicate or previously reported subevents.
3. Go through the timeline in a sequential manner and write a background summary for each timestep.
#### Who are the annotators?
We hired three professional annotators. For each timeline, we collect three independent (rewritten) update and (new) background pairs.
#### Personal and Sensitive Information
To the best of our knowledge, there is no personal or sensitive information in this dataset.
Bias, Risks, and Limitations
----------------------------
### Limitations
**Personalized Backgrounds:** While a background summary can be useful to any news reader, the utility can vary depending on the reader's familiarity with the event. This dataset doesn't include any backgrounds customized to individual readers.
**Local Events:** This dataset is limited to globally popular events involving disasters and conflicts. We leave the task of collecting background summaries for local events to future work.
**Background from News Articles:** Background summaries can also be generated directly from news articles. In this dataset, we only consider background summaries based on past news updates. We leave the extension to news articles to future work.
**BibTeX:**
Glossary
--------
**Major event:** the key news story for which we are constructing a timeline. For instance, 'Egyptian Crisis', 'BP oil spill', 'MH 370 disappearance' are some of the major events from our dataset.
**Timeline:** a series of timesteps. Each timestep in a timeline is associated with an update and a background summary.
**Timestep:** day of the event ('yyyy-mm-dd').
**Update:** a short text summary of *what's new* in the news story. This text summarizes the latest events, specifically ones that are important to the overall story.
**Background:** a short text summary that provides *sufficient historical context* for the current update. Background aims to provide the reader a quick history of the news story, without them having to read all the previous updates. Background should cover past events that help in understanding the current events described in the update.
Dataset Card Authors
--------------------
Adithya Pratapa, Kevin Small, Markus Dreyer
Dataset Card Contact
--------------------
Adithya Pratapa
| [
"### Dataset Description\n\n\nGenerating concise summaries of news events is a challenging natural language processing task. While journalists often curate timelines to highlight key sub-events, newcomers to a news event face challenges in catching up on its historical context. This dataset addresses this need by introducing the task of background news summarization, which complements each timeline update with a background summary of relevant preceding events. This dataset includes human-annotated backgrounds for 14 major news events from 2005--2014.\n\n\n* Curated by: Adithya Pratapa, Kevin Small, Markus Dreyer\n* Language(s) (NLP): English\n* License: CC-BY-NC-4.0",
"### Dataset Sources\n\n\n* Repository: URL\n* Paper: URL\n\n\nUses\n----",
"### Direct Use\n\n\nThis dataset can be used for training text summarization systems. The trained systems would be capable of generating background (historical context) to a news update. To generate the background, the system takes past news updates as input.",
"### Out-of-Scope Use\n\n\nSystems trained on this dataset might not perform as expected on domains other than newswire. To avoid factual errors, system-generated summaries should be verified by experts before deploying in real-world.\n\n\nDataset Structure\n-----------------",
"### Dataset Fields",
"### Data Splits\n\n\nAn overview of the major events and their splits in this dataset. The last column provides the statistics for background annotations provided in this dataset.\n\n\n\nDataset Creation\n----------------",
"### Curation Rationale\n\n\nReaders often find it difficult to keep track of complex news events. A background summary that provides sufficient historical context can help improve the reader's understanding of a news update. This dataset provides human-annotated backgrounds for development and evaluation of background summarization systems.",
"### Source Data",
"#### Data Collection and Processing\n\n\nThis dataset is built upon three popular news timeline summarization datasets, Timeline17 (Binh Tran et al., 2013), Crisis (Tran et al., 2015), and Social Timeline (Wang et al., 2015).",
"#### Who are the source data producers?\n\n\n**Timeline17:** compiled from an ensemble of news websites, this dataset provides 17 timelines spanning 9 major events from 2005--2013.\n\n\n**Crisis:** a follow-up to the Timeline17 dataset, this covers 25 timelines spanning 4 major events. While it mostly covers a subset of events from Timeline17, it adds a new event (the Yemen crisis).\n\n\n**Social Timeline:** compiled 6 timelines covering 4 major events from 2014. The timelines were collected from Wikipedia, NYTimes, and BBC.",
"### Annotations",
"#### Annotation process\n\n\nTimelines were originally collected from various news websites (CNN, BBC, NYTimes, etc.), many events have more than one timeline. Since each timeline covers the same underlying event, we merge them using timestamps to create a single timeline per event. During this merging process, we often end up with more than one update text per timestamp with possibly duplicate content. We ask the annotators to first rewrite the input updates to remove any duplicate content. Our annotation process for each news event contains the following three steps:\n\n\n1. Read the input timeline to get a high-level understanding of the event.\n2. For each timestep, read the provided 'rough' update summary. Rewrite the update into a short paragraph, removing any duplicate or previously reported subevents.\n3. Go through the timeline in a sequential manner and write a background summary for each timestep.",
"#### Who are the annotators?\n\n\nWe hired three professional annotators. For each timeline, we collect three independent (rewritten) update and (new) background pairs.",
"#### Personal and Sensitive Information\n\n\nTo the best of our knowledge, there is no personal or sensitive information in this dataset.\n\n\nBias, Risks, and Limitations\n----------------------------",
"### Limitations\n\n\n**Personalized Backgrounds:** While a background summary can be useful to any news reader, the utility can vary depending on the reader's familiarity with the event. This dataset doesn't include any backgrounds customized to individual readers.\n\n\n**Local Events:** This dataset is limited to globally popular events involving disasters and conflicts. We leave the task of collecting background summaries for local events to future work.\n\n\n**Background from News Articles:** Background summaries can also be generated directly from news articles. In this dataset, we only consider background summaries based on past news updates. We leave the extension to news articles to future work.\n\n\n**BibTeX:**\n\n\nGlossary\n--------\n\n\n**Major event:** the key news story for which we are constructing a timeline. For instance, 'Egyptian Crisis', 'BP oil spill', 'MH 370 disappearance' are some of the super events from our dataset.\n\n\n**Timeline:** a series of timesteps. Each timestep in a timeline is associated with an update and a background summary.\n\n\n**Timestep:** day of the event ('yyyy-mm-dd').\n\n\n**Update:** a short text summary of *what's new* in the news story. This text summarizes the latest events, specifically ones that are important to the overall story.\n\n\n**Background:** a short text summary that provides *sufficient historical context* for the current update. Background aims to provide the reader a quick history of the news story, without them having to read all the previous updates. Background should cover past events that help in understanding the current events described in the update.\n\n\nDataset Card Authors\n--------------------\n\n\nAdithya Pratapa, Kevin Small, Markus Dreyer\n\n\nDataset Card Contact\n--------------------\n\n\nAdithya Pratapa"
]
| [
"TAGS\n#task_categories-summarization #annotations_creators-expert-generated #language_creators-expert-generated #size_categories-1K<n<10K #source_datasets-Timeline17 #source_datasets-Crisis #source_datasets-SocialTimeline #language-English #license-cc-by-nc-4.0 #summarization #event-summarization #background-summarization #arxiv-2310.16197 #region-us \n",
"### Dataset Description\n\n\nGenerating concise summaries of news events is a challenging natural language processing task. While journalists often curate timelines to highlight key sub-events, newcomers to a news event face challenges in catching up on its historical context. This dataset addresses this need by introducing the task of background news summarization, which complements each timeline update with a background summary of relevant preceding events. This dataset includes human-annotated backgrounds for 14 major news events from 2005--2014.\n\n\n* Curated by: Adithya Pratapa, Kevin Small, Markus Dreyer\n* Language(s) (NLP): English\n* License: CC-BY-NC-4.0",
"### Dataset Sources\n\n\n* Repository: URL\n* Paper: URL\n\n\nUses\n----",
"### Direct Use\n\n\nThis dataset can be used for training text summarization systems. The trained systems would be capable of generating background (historical context) to a news update. To generate the background, the system takes past news updates as input.",
"### Out-of-Scope Use\n\n\nSystems trained on this dataset might not perform as expected on domains other than newswire. To avoid factual errors, system-generated summaries should be verified by experts before deploying in real-world.\n\n\nDataset Structure\n-----------------",
"### Dataset Fields",
"### Data Splits\n\n\nAn overview of the major events and their splits in this dataset. The last column provides the statistics for background annotations provided in this dataset.\n\n\n\nDataset Creation\n----------------",
"### Curation Rationale\n\n\nReaders often find it difficult to keep track of complex news events. A background summary that provides sufficient historical context can help improve the reader's understanding of a news update. This dataset provides human-annotated backgrounds for development and evaluation of background summarization systems.",
"### Source Data",
"#### Data Collection and Processing\n\n\nThis dataset is built upon three popular news timeline summarization datasets, Timeline17 (Binh Tran et al., 2013), Crisis (Tran et al., 2015), and Social Timeline (Wang et al., 2015).",
"#### Who are the source data producers?\n\n\n**Timeline17:** compiled from an ensemble of news websites, this dataset provides 17 timelines spanning 9 major events from 2005--2013.\n\n\n**Crisis:** a follow-up to the Timeline17 dataset, this covers 25 timelines spanning 4 major events. While it mostly covers a subset of events from Timeline17, it adds a new event (the Yemen crisis).\n\n\n**Social Timeline:** compiled 6 timelines covering 4 major events from 2014. The timelines were collected from Wikipedia, NYTimes, and BBC.",
"### Annotations",
"#### Annotation process\n\n\nTimelines were originally collected from various news websites (CNN, BBC, NYTimes, etc.), many events have more than one timeline. Since each timeline covers the same underlying event, we merge them using timestamps to create a single timeline per event. During this merging process, we often end up with more than one update text per timestamp with possibly duplicate content. We ask the annotators to first rewrite the input updates to remove any duplicate content. Our annotation process for each news event contains the following three steps:\n\n\n1. Read the input timeline to get a high-level understanding of the event.\n2. For each timestep, read the provided 'rough' update summary. Rewrite the update into a short paragraph, removing any duplicate or previously reported subevents.\n3. Go through the timeline in a sequential manner and write a background summary for each timestep.",
"#### Who are the annotators?\n\n\nWe hired three professional annotators. For each timeline, we collect three independent (rewritten) update and (new) background pairs.",
"#### Personal and Sensitive Information\n\n\nTo the best of our knowledge, there is no personal or sensitive information in this dataset.\n\n\nBias, Risks, and Limitations\n----------------------------",
"### Limitations\n\n\n**Personalized Backgrounds:** While a background summary can be useful to any news reader, the utility can vary depending on the reader's familiarity with the event. This dataset doesn't include any backgrounds customized to individual readers.\n\n\n**Local Events:** This dataset is limited to globally popular events involving disasters and conflicts. We leave the task of collecting background summaries for local events to future work.\n\n\n**Background from News Articles:** Background summaries can also be generated directly from news articles. In this dataset, we only consider background summaries based on past news updates. We leave the extension to news articles to future work.\n\n\n**BibTeX:**\n\n\nGlossary\n--------\n\n\n**Major event:** the key news story for which we are constructing a timeline. For instance, 'Egyptian Crisis', 'BP oil spill', 'MH 370 disappearance' are some of the super events from our dataset.\n\n\n**Timeline:** a series of timesteps. Each timestep in a timeline is associated with an update and a background summary.\n\n\n**Timestep:** day of the event ('yyyy-mm-dd').\n\n\n**Update:** a short text summary of *what's new* in the news story. This text summarizes the latest events, specifically ones that are important to the overall story.\n\n\n**Background:** a short text summary that provides *sufficient historical context* for the current update. Background aims to provide the reader a quick history of the news story, without them having to read all the previous updates. Background should cover past events that help in understanding the current events described in the update.\n\n\nDataset Card Authors\n--------------------\n\n\nAdithya Pratapa, Kevin Small, Markus Dreyer\n\n\nDataset Card Contact\n--------------------\n\n\nAdithya Pratapa"
]
|
7c51eb5e89f2ef623f7e89112382dad01087a887 | # Dataset Card for "hs_dev_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibragim-bad/hs_dev_test | [
"region:us"
]
| 2023-12-07T07:30:50+00:00 | {"dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "gold_generation", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 7885767, "num_examples": 10003}, {"name": "dev", "num_bytes": 9610103, "num_examples": 10042}], "download_size": 10451785, "dataset_size": 17495870}} | 2023-12-07T07:31:00+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "hs_dev_test"
More Information needed | [
"# Dataset Card for \"hs_dev_test\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"hs_dev_test\"\n\nMore Information needed"
]
|
667f7ad4a0e24cb5895e3e76d03a8961813899bd | ## About
This is a curated subset of 3 representative samples per class (77 classes in total) for the Banking77 dataset, as collected by a domain expert.
It was used in the paper "Making LLMs Worth Every Penny: Resource-Limited Text Classification in Banking", published in ACM ICAIF 2023 (https://arxiv.org/abs/2311.06102).
Our findings show that Few-Shot Text Classification on representative samples is better than on randomly selected samples.
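A minimal sketch of how a per-class subset like this is typically turned into few-shot demonstrations; the loading call and the `text`/`label` column names are assumptions about the repository layout, not something this card specifies.

```python
# Hypothetical few-shot prompting sketch. Column names ("text", "label") and the
# load_dataset call are assumptions, not taken from this card.
from datasets import load_dataset

ds = load_dataset("helvia/banking77-representative-samples", split="train")

demos = {}
for row in ds:
    demos.setdefault(row["label"], []).append(row["text"])  # up to 3 examples per intent

few_shot_block = "\n\n".join(
    f"Utterance: {text}\nIntent: {label}"
    for label, texts in demos.items()
    for text in texts
)
```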
## Citation
```
@inproceedings{10.1145/3604237.3626891,
author = {Loukas, Lefteris and Stogiannidis, Ilias and Diamantopoulos, Odysseas and Malakasiotis, Prodromos and Vassos, Stavros},
title = {Making LLMs Worth Every Penny: Resource-Limited Text Classification in Banking},
year = {2023},
isbn = {9798400702402},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3604237.3626891},
doi = {10.1145/3604237.3626891},
pages = {392–400},
numpages = {9},
keywords = {Anthropic, Cohere, OpenAI, LLMs, NLP, Claude, GPT, Few-shot},
location = {Brooklyn, NY, USA},
series = {ICAIF '23}
}
```
---
language:
- en
Tags:
- banking77
- classification
- conversational
--- | helvia/banking77-representative-samples | [
"arxiv:2311.06102",
"region:us"
]
| 2023-12-07T07:40:52+00:00 | {} | 2024-02-13T15:45:17+00:00 | [
"2311.06102"
]
| []
| TAGS
#arxiv-2311.06102 #region-us
| ## About
This is a curated subset of 3 representative samples per class (77 classes in total) for the Banking77 dataset, as collected by a domain expert.
It was used in the paper "Making LLMs Worth Every Penny: Resource-Limited Text Classification in Banking", published in ACM ICAIF 2023 (URL).
Our findings show that Few-Shot Text Classification with representative samples performs better than with randomly selected samples.
---
language:
- en
Tags:
- banking77
- classification
- conversational
--- | [
"## About\nThis is a curated subset of 3 representative samples per class (77 classes in total) for the Banking77 dataset, as collected by a domain expert.\nIt was used in the paper \"Making LLMs Worth Every Penny: Resource-Limited Text Classification in Banking\", published in ACM ICAIF 2023 (URL\nOur findings show that Few-Shot Text Classification on representative samples are better than randomly selected samples.\n\n---\nlanguage:\n- en\nTags:\n- banking77\n- classification\n- conversational\n---"
]
| [
"TAGS\n#arxiv-2311.06102 #region-us \n",
"## About\nThis is a curated subset of 3 representative samples per class (77 classes in total) for the Banking77 dataset, as collected by a domain expert.\nIt was used in the paper \"Making LLMs Worth Every Penny: Resource-Limited Text Classification in Banking\", published in ACM ICAIF 2023 (URL\nOur findings show that Few-Shot Text Classification on representative samples are better than randomly selected samples.\n\n---\nlanguage:\n- en\nTags:\n- banking77\n- classification\n- conversational\n---"
]
|
541c31dc0d74815f9603444944b2dc496b49aec4 |
# Dataset of Takimoto Hifumi
This is the dataset of Takimoto Hifumi, containing 277 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 277 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 634 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 721 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 277 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 277 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 277 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 634 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 634 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 548 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 721 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 721 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
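For reference, a single archive from the table above could be fetched with `huggingface_hub` (a sketch; any of the listed zip filenames can be substituted):

```
from huggingface_hub import hf_hub_download

# Download one of the packaged archives from this dataset repository.
path = hf_hub_download(
    repo_id="CyberHarem/takimoto_hifumi_newgame",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(path)
```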
| CyberHarem/takimoto_hifumi_newgame | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
]
| 2023-12-07T07:42:20+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-07T07:55:33+00:00 | []
| []
| TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Takimoto Hifumi
==========================
This is the dataset of Takimoto Hifumi, containing 277 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
| []
| [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
]
|
4028e1c35cb723aebad8a724059c6a6793b95dd3 | # Dataset Card for "en_wiki_sentences_1000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | deokhk/en_wiki_sentences_1000000 | [
"region:us"
]
| 2023-12-07T08:02:56+00:00 | {"dataset_info": {"features": [{"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 124980032, "num_examples": 1000000}, {"name": "dev", "num_bytes": 123586, "num_examples": 1000}], "download_size": 77463265, "dataset_size": 125103618}} | 2023-12-07T08:03:08+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "en_wiki_sentences_1000000"
More Information needed | [
"# Dataset Card for \"en_wiki_sentences_1000000\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"en_wiki_sentences_1000000\"\n\nMore Information needed"
]
|
5ab89a7e6b2d8ca6e82233aaafe2424e86af5556 | # Dataset Card for "ko_wiki_sentences_1000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | deokhk/ko_wiki_sentences_1000000 | [
"region:us"
]
| 2023-12-07T08:03:19+00:00 | {"dataset_info": {"features": [{"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 175952266, "num_examples": 1000000}, {"name": "dev", "num_bytes": 166735, "num_examples": 1000}], "download_size": 106344214, "dataset_size": 176119001}} | 2023-12-07T08:03:34+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "ko_wiki_sentences_1000000"
More Information needed | [
"# Dataset Card for \"ko_wiki_sentences_1000000\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"ko_wiki_sentences_1000000\"\n\nMore Information needed"
]
|
1b1185018f8a16c9da5949459d223efdd16fe40d | # Dataset Card for "zh_wiki_sentences_1000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | deokhk/zh_wiki_sentences_1000000 | [
"region:us"
]
| 2023-12-07T08:03:43+00:00 | {"dataset_info": {"features": [{"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 127836004, "num_examples": 1000000}, {"name": "dev", "num_bytes": 135625, "num_examples": 1000}], "download_size": 88011343, "dataset_size": 127971629}} | 2023-12-07T08:03:59+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "zh_wiki_sentences_1000000"
More Information needed | [
"# Dataset Card for \"zh_wiki_sentences_1000000\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"zh_wiki_sentences_1000000\"\n\nMore Information needed"
]
|
9943895e5c6ec7bf1f3febe10f23fcdc8d584570 |
## Dataset Description
- **Homepage:** https://trusthlt.org/lacour
- **Repository:** https://github.com/trusthlt/lacour-corpus
- **Paper:** tbd
### Dataset Summary
This dataset contains transcribed court hearings sourced from official hearings of the __European Court of Human Rights__ ([https://www.echr.coe.int/webcasts-of-hearings](https://www.echr.coe.int/webcasts-of-hearings)). The hearings are 154 selected webcasts (videos) from 2012-2022 in their original language (no interpretation). With manual annotation for language labels and automatic processing of the extracted audio with [pyannote](https://huggingface.co/pyannote/speaker-diarization) and [whisper-large-v2](https://huggingface.co/openai/whisper-large-v2), the resulting dataset contains 4000 speaker turns and 88920 individual lines. The dataset contains two subsets, the transcripts and the metadata with linked documents. The transcripts are additionally available as .txt or .xml.
### Languages
The largest amounts in the transcripts are:
English, French
A smaller portion also contains the following languages:
Russian, Spanish, Croatian, Italian, Portuguese, Turkish, Polish, Lithuanian, German, Ukrainian, Hungarian, Dutch, Albanian, Romanian, Serbian
The collected metadata is:
English
## Dataset Structure
### Data Instances
Each instance in transcripts represents an entire segment of a transcript, similar to a conversation turn in a dialog.
```
{
'id': 0,
'webcast_id': '1021112_29112017',
'segment_id': 0,
'speaker_name': 'UNK',
'speaker_role': 'Announcer',
'data': {
'begin': [12.479999542236328],
'end': [13.359999656677246],
'language': ['fr'],
'text': ['La Cour!']
}
}
```
Each instance in documents represents information on a document in hudoc associated with a hearing, together with the metadata of that hearing. The actual document is linked and can also be found in [hudoc](https://hudoc.echr.coe.int) via the case_id. Note: `hearing_type` states the type of the hearing, while `type` states the type of the document. If the hearing is a "Grand Chamber hearing", the "CHAMBER" document refers to a different hearing.
```
{
'id': 16,
'webcast_id': '1232311_02102012',
'hearing_title': 'Michaud v. France (nos. 12323/11)',
'hearing_date': '2012-10-02 00:00:00',
'hearing_type': 'Chamber hearing',
'application_number': ['12323/11'],
'case_id': '001-115377',
'case_name': 'CASE OF MICHAUD v. FRANCE',
'case_url': 'https://hudoc.echr.coe.int/eng?i=001-115377',
'ecli': 'ECLI:CE:ECHR:2012:1206JUD001232311',
'type': 'CHAMBER',
'document_date': '2012-12-06 00:00:00',
'importance': 1,
'articles': ['8', '8-1', '8-2', '34', '35'],
'respondent_government': ['FRA'],
'issue': 'Decision of the National Bar Council of 12 July 2007 “adopting regulations on internal procedures for implementing the obligation to combat money laundering and terrorist financing, and an internal supervisory mechanism to guarantee compliance with those procedures” ; Article 21-1 of the Law of 31 December 1971 ; Law no. 2004-130 of 11 February 2004 ; Monetary and Financial Code',
'strasbourg_caselaw': 'André and Other v. France, no 18603/03, 24 July 2008;Bosphorus Hava Yollari Turizm ve Ticaret Anonim Sirketi v. Ireland [GC], no 45036/98, ECHR 2005-VI;[...]',
'external_sources': 'Directive 91/308/EEC, 10 June 1991;Article 6 of the Treaty on European Union;Charter of Fundamental Rights of the European Union;Articles 169, 170, 173, 175, 177, 184 and 189 of the Treaty establishing the European Community;Recommendations 12 and 16 of the financial action task force (“FATF”) on money laundering;Council of Europe Convention on Laundering, Search, Seizure and Confiscation of the Proceeds from Crime and on the Financing of Terrorism (16 May 2005)',
'conclusion': 'Remainder inadmissible;No violation of Article 8 - Right to respect for private and family life (Article 8-1 - Respect for correspondence;Respect for private life)',
'separate_opinion': True
}
```
### Data Fields
transcripts:
* id: the identifier
* webcast_id: the identifier for the hearing
* segment_id: the identifier of the current speaker segment in the current hearing
* speaker_name: the name of the speaker (not given for Applicant, Government or Third Party)
* speaker_role: the role/party the speaker represents (`Announcer` for announcements, `Judge` for judges, `JudgeP` for judge president, `Applicant` for representatives of the applicant, `Government` for representatives of the respondent government, `ThirdParty` for representatives of third party interveners)
* data: sequence of the following fields
* begin: the timestamp for begin of line (in seconds)
* end: the timestamp for end of line (in seconds)
* language: the language spoken (in ISO 639-1)
* text: the spoken line
documents:
* id: the identifier
* webcast_id: the identifier for the hearing (allows linking to transcripts)
* hearing_title: the title of the hearing
* hearing_date: the date of the hearing
* hearing_type: the type of hearing (Grand Chamber, Chamber or Grand Chamber Judgment Hearing)
* application_number: the application numbers which are associated with the hearing and case
* case_id: the id of the case
* case_name: the name of the case
* case_url: the direct link to the document
* ecli: the ECLI (European Case Law Identifier)
* type: the type of the document
* document_date: the date of the document
* importance: the importance score of the case (1 is the highest importance, key case)
* articles: the concerning articles of the Convention of Human Rights
* respondent_government: the code of the respondent government(s) (in ISO-3166 Alpha-3)
* issue: the references to the issue of the case
* strasbourg_caselaw: the list of cases in the ECHR which are relevant to the current case
* external_sources: the relevant references outside of the ECHR
* conclusion: the short textual description of the conclusion
* separate_opinion: the indicator if there is a separate opinion
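As an illustration of how the two subsets fit together, the following is a minimal sketch (not part of the official documentation; it uses the config names `transcripts` and `documents` and the field names described above) that joins them on `webcast_id` and prints the English lines spoken by judges in one hearing:

```
from datasets import load_dataset

transcripts = load_dataset("TrustHLT/LaCour", "transcripts", split="train")
documents = load_dataset("TrustHLT/LaCour", "documents", split="train")

# Index the hearing/document metadata by webcast_id so transcript segments can be linked to it.
docs_by_webcast = {}
for doc in documents:
    docs_by_webcast.setdefault(doc["webcast_id"], []).append(doc)

# Print the English lines spoken by judges in the first hearing of the transcripts subset.
first_id = transcripts[0]["webcast_id"]
print(docs_by_webcast.get(first_id, [{}])[0].get("hearing_title", "unknown hearing"))
for segment in transcripts:
    if segment["webcast_id"] != first_id:
        continue
    if segment["speaker_role"] not in ("Judge", "JudgeP"):
        continue
    data = segment["data"]  # dict of parallel lists: begin, end, language, text
    for lang, text in zip(data["language"], data["text"]):
        if lang == "en":
            print(f'{segment["speaker_role"]} {segment["speaker_name"]}: {text}')
```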
### Data Splits
The dataset is only split into a train set.
## Dataset Creation
### Curation Rationale
This dataset provides partly corrected transcribed webcasts to enable the processing of hearings in legal NLP. No specific task is given.
### Source Data
#### Data Collection
The data was collected by transcribing the publicly available [webcasts of the ECHR](https://www.echr.coe.int/webcasts-of-hearings) with the help of [pyannote](https://huggingface.co/pyannote/speaker-diarization) and [whisper-large-v2](https://huggingface.co/openai/whisper-large-v2). The documents were sourced from the [ECHR hudoc database](https://hudoc.echr.coe.int).
#### Who are the source producers?
Participants in hearings before the ECHR for the audio and video material. Employees and judges of the ECHR for the documents.
### Annotations
#### Annotation process
**language identification** Spoken languages were manually identified by research assistants. Disagreements were discussed to achieve the final language label.
**transcript correction** All parts spoken by Judge or Judge President are corrected for the languages English and French by research assistants with a high proficiency in the respective language.
#### Personal and Sensitive Information
The dataset contains names of judges and other participants in the hearings. Due to those names being available in the public court material, we did not remove them. The machine-generated transcripts may also contain names, which were neither checked nor removed. In case of sensitive information, we rely on the provided material to provide protection (occasionally bleeping out names which should not have been mentioned in webcasts, appropriate anonymization in the documents).
## Additional Information
Download the transcripts and linked documents:
```
from datasets import load_dataset
lacour_transcripts = load_dataset("TrustHLT/LaCour", "transcripts") # default config
lacour_documents = load_dataset("TrustHLT/LaCour", "documents")
```
Formatted versions of the transcripts in .txt and .xml and more information on the collection and creation can be found on [github](https://github.com/trusthlt/lacour-corpus).
### Citation Information
Please cite this data using:
```
[tbd]
``` | TrustHLT/LaCour | [
"language_creators:found",
"language_creators:machine-generated",
"multilinguality:multilingual",
"size_categories:1K<n<10K",
"size_categories:n<1K",
"language:en",
"language:fr",
"language:ru",
"language:es",
"language:hr",
"language:it",
"language:pt",
"language:tr",
"language:pl",
"language:lt",
"language:de",
"language:uk",
"language:hu",
"language:nl",
"language:sq",
"language:ro",
"language:sr",
"license:cc-by-sa-4.0",
"legal",
"hearing",
"oral argument",
"transcript",
"echr",
"dialog",
"region:us"
]
| 2023-12-07T08:16:31+00:00 | {"language_creators": ["found", "machine-generated"], "language": ["en", "fr", "ru", "es", "hr", "it", "pt", "tr", "pl", "lt", "de", "uk", "hu", "nl", "sq", "ro", "sr"], "license": ["cc-by-sa-4.0"], "multilinguality": ["multilingual"], "size_categories": ["1K<n<10K", "n<1K"], "pretty_name": "LaCour!", "tags": ["legal", "hearing", "oral argument", "transcript", "echr", "dialog"], "dataset_info": [{"config_name": "documents", "features": [{"name": "id", "dtype": "int32"}, {"name": "webcast_id", "dtype": "string"}, {"name": "hearing_title", "dtype": "string"}, {"name": "hearing_date", "dtype": "string"}, {"name": "hearing_type", "dtype": "string"}, {"name": "application_number", "sequence": "string"}, {"name": "case_id", "dtype": "string"}, {"name": "case_name", "dtype": "string"}, {"name": "case_url", "dtype": "string"}, {"name": "ecli", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "document_date", "dtype": "string"}, {"name": "importance", "dtype": "int32"}, {"name": "articles", "sequence": "string"}, {"name": "respondent_government", "sequence": "string"}, {"name": "issue", "dtype": "string"}, {"name": "strasbourg_caselaw", "dtype": "string"}, {"name": "external_sources", "dtype": "string"}, {"name": "conclusion", "dtype": "string"}, {"name": "separate_opinion", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 1007746, "num_examples": 379}], "download_size": 1714597, "dataset_size": 1007746}, {"config_name": "transcripts", "features": [{"name": "id", "dtype": "int32"}, {"name": "webcast_id", "dtype": "string"}, {"name": "segment_id", "dtype": "int32"}, {"name": "speaker_name", "dtype": "string"}, {"name": "speaker_role", "dtype": "string"}, {"name": "data", "sequence": [{"name": "begin", "dtype": "float32"}, {"name": "end", "dtype": "float32"}, {"name": "language", "dtype": "string"}, {"name": "text", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 13950690, "num_examples": 4000}], "download_size": 5475134, "dataset_size": 13950690}], "viewer": false} | 2023-12-18T23:25:40+00:00 | []
| [
"en",
"fr",
"ru",
"es",
"hr",
"it",
"pt",
"tr",
"pl",
"lt",
"de",
"uk",
"hu",
"nl",
"sq",
"ro",
"sr"
]
| TAGS
#language_creators-found #language_creators-machine-generated #multilinguality-multilingual #size_categories-1K<n<10K #size_categories-n<1K #language-English #language-French #language-Russian #language-Spanish #language-Croatian #language-Italian #language-Portuguese #language-Turkish #language-Polish #language-Lithuanian #language-German #language-Ukrainian #language-Hungarian #language-Dutch #language-Albanian #language-Romanian #language-Serbian #license-cc-by-sa-4.0 #legal #hearing #oral argument #transcript #echr #dialog #region-us
|
## Dataset Description
- Homepage: URL
- Repository: URL
- Paper: tbd
### Dataset Summary
This dataset contains transcribed court hearings sourced from official hearings of the __European Court of Human Rights__ (URL The hearings are 154 selected webcasts (videos) from 2012-2022 in their original language (no interpretation). With manual annotation for language labels and automatic processing of the extracted audio with pyannote and whisper-large-v2, the resulting dataset contains 4000 speaker turns and 88920 individual lines. The dataset contains two subsets, the transcripts and the metadata with linked documents. The transcripts are additionally available as .txt or .xml.
### Languages
The largest amounts in the transcripts are:
English, French
A smaller portion also contains the following languages:
Russian, Spanish, Croatian, Italian, Portuguese, Turkish, Polish, Lithuanian, German, Ukrainian, Hungarian, Dutch, Albanian, Romanian, Serbian
The collected metadata is:
English
## Dataset Structure
### Data Instances
Each instance in transcripts represents an entire segment of a transcript, similar to a conversation turn in a dialog.
Each instance in documents represents information on a document in hudoc associated with a hearing, together with the metadata of that hearing. The actual document is linked and can also be found in hudoc via the case_id. Note: 'hearing_type' states the type of the hearing, while 'type' states the type of the document. If the hearing is a "Grand Chamber hearing", the "CHAMBER" document refers to a different hearing.
### Data Fields
transcripts:
* id: the identifier
* webcast_id: the identifier for the hearing
* segment_id: the identifier of the current speaker segment in the current hearing
* speaker_name: the name of the speaker (not given for Applicant, Government or Third Party)
* speaker_role: the role/party the speaker represents ('Announcer' for announcements, 'Judge' for judges, 'JudgeP' for judge president, 'Applicant' for representatives of the applicant, 'Government' for representatives of the respondent government, 'ThirdParty' for representatives of third party interveners)
* data: sequence of the following fields
* begin: the timestamp for begin of line (in seconds)
* end: the timestamp for end of line (in seconds)
* language: the language spoken (in ISO 639-1)
* text: the spoken line
documents:
* id: the identifier
* webcast_id: the identifier for the hearing (allows linking to transcripts)
* hearing_title: the title of the hearing
* hearing_date: the date of the hearing
* hearing_type: the type of hearing (Grand Chamber, Chamber or Grand Chamber Judgment Hearing)
* application_number: the application numbers which are associated with the hearing and case
* case_id: the id of the case
* case_name: the name of the case
* case_url: the direct link to the document
* ecli: the ECLI (European Case Law Identifier)
* type: the type of the document
* document_date: the date of the document
* importance: the importance score of the case (1 is the highest importance, key case)
* articles: the concerning articles of the Convention of Human Rights
* respondent_government: the code of the respondent government(s) (in ISO-3166 Alpha-3)
* issue: the references to the issue of the case
* strasbourg_caselaw: the list of cases in the ECHR which are relevant to the current case
* external_sources: the relevant references outside of the ECHR
* conclusion: the short textual description of the conclusion
* separate_opinion: the indicator if there is a separate opinion
### Data Splits
The dataset is only split into a train set.
## Dataset Creation
### Curation Rationale
This dataset provides partly corrected transcribed webcasts to enable the processing of hearings in legal NLP. No specific task is given.
### Source Data
#### Data Collection
The data was collected by transcribing the publicly available webcasts of the ECHR with the help of pyannote and whisper-large-v2. The documents were sourced from the ECHR hudoc database.
#### Who are the source producers?
Participants in hearings before the ECHR for the audio and video material. Employees and judges of the ECHR for the documents.
### Annotations
#### Annotation process
language identification Spoken languages were manually identified by research assistants. Disagreements were discussed to achieve the final language label.
transcript correction All parts spoken by Judge or Judge President are corrected for the languages English and French by research assistants with a high proficiency in the respective language.
#### Personal and Sensitive Information
The dataset contains names of judges and other participants in the hearings. Due to those names being available in the public court material, we did not remove them. The machine-generated transcripts may also contain names, which were neither checked nor removed. In case of sensitive information, we rely on the provided material to provide protection (occasionally bleeping out names which should not have been mentioned in webcasts, appropriate anonymization in the documents).
## Additional Information
Download the transcripts and linked documents:
Formatted versions of the transcripts in .txt and .xml and more information on the collection and creation can be found on github.
Please cite this data using:
| [
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: tbd",
"### Dataset Summary\n\nThis dataset contains transcribed court hearings sourced from official hearings of the __European Court of Human Rights__ (URL The hearings are 154 selected webcasts (videos) from 2012-2022 in their original language (no interpretation). With manual annotation for language labels and automatic processing of the extracted audio with pyannote and whisper-large-v2, the resulting dataset contains 4000 speaker turns and 88920 individual lines. The dataset contains two subsets, the transcripts and the metadata with linked documents. The transcripts are additionally available as .txt or .xml.",
"### Languages\n\nThe largest amounts in the transcripts are:\n\nEnglish, French\n\nA smaller portion also contains the following languages:\n\nRussian, Spanish, Croatian, Italian, Portuguese, Turkish, Polish, Lithuanian, German, Ukrainian, Hungarian, Dutch, Albanian, Romanian, Serbian\n\nThe collected metadata is:\n\nEnglish",
"## Dataset Structure",
"### Data Instances\n\nEach instance in transcripts represents an entire segment of a transcript, similar to a conversation turn in a dialog. \n\n\n\nEach instance in documents represents a information on a document in hudoc associated with a hearing and the metadata associated with a hearing. The actual document is linked and can also be found in hudoc with the case_id. Note: 'hearing_type' states the type of the hearing, 'type' states the type of the document. If the hearing is a \"Grand Chamber hearing\", the \"CHAMBER\" document refers to a different hearing.",
"### Data Fields\n\ntranscripts:\n\n* id: the identifier\n* webcast_id: the identifier for the hearing\n* segment_id: the identifier of the current speaker segment in the current hearing\n* speaker_name: the name of the speaker (not given for Applicant, Government or Third Party)\n* speaker_role: the role/party the speaker represents ('Announcer' for announcements, 'Judge' for judges, 'JudgeP' for judge president, 'Applicant' for representatives of the applicant, 'Government' for representatives of the respondent government, 'ThirdParty' for representatives of third party interveners)\n* data: sequence of the following fields\n * begin: the timestamp for begin of line (in seconds)\n * end: the timestamp for end of line (in seconds)\n * language: the language spoken (in ISO 639-1)\n * text: the spoken line\n\ndocuments:\n\n* id: the identifier\n* webcast_id: the identifier for the hearing (allows linking to transcripts)\n* hearing_title: the title of the hearing\n* hearing_date: the date of the hearing\n* hearing_type: the type of hearing (Grand Chamber, Chamber or Grand Chamber Judgment Hearing)\n* application_number: the application numbers which are associated with the hearing and case\n* case_id: the id of the case\n* case_name: the name of the case \n* case_url: the direct link to the document\n* ecli: the ECLI (European Case Law Identifier)\n* type: the type of the document\n* document_date: the date of the document\n* importance: the importance score of the case (1 is the highest importance, key case)\n* articles: the concerning articles of the Convention of Human Rights\n* respondent_government: the code of the respondent government(s) (in ISO-3166 Alpha-3)\n* issue: the references to the issue of the case\n* strasbourg_caselaw: the list of cases in the ECHR which are relevant to the current case\n* external_sources: the relevant references outside of the ECHR\n* conclusion: the short textual description of the conclusion\n* separate_opinion: the indicator if there is a separate opinion",
"### Data Splits\n\nThe dataset is only split into a train set.",
"## Dataset Creation",
"### Curation Rationale\n\nThis datasets provides partly corrected transcribed webcasts to enable the processing of hearings in legal NLP. No specific task is given.",
"### Source Data",
"#### Data Collection\n\nThe data was collected by transcribing the publicly available webcasts of the ECHR with the help of pyannote and whisper-large-v2. The documents were sourced from the ECHR hudoc database.",
"#### Who are the source producers?\n\nParticipants in hearings before the ECHR for the audio and video material. Employees and judges of the ECHR for the documents.",
"### Annotations",
"#### Annotation process\n\nlanguage identification Spoken languages were manually identified by research assistants. Disagreements were discussed to achieve the final language label.\n\ntranscript correction All parts spoken by Judge or Judge President are corrected for the languages English and French by research assistants with a high proficiency in the respective language.",
"#### Personal and Sensitive Information\n\nThe dataset contains names of judges and other participants in the hearings. Due to those names being available in the public court material, we did not remove them. The machine-generated transcripts may also contain names, which were neither checked nor removed. In case of sensitive information, we rely on the provided material to provide protection (occasionally bleeping out names which should not have been mentioned in webcasts, appropriate anonymization in the documents).",
"## Additional Information\n\nDownload the transcripts and linked documents:\n\n\n\nFormatted versions of the transcripts in .txt and .xml and more information on the collection and creation can be found on github.\n\n\n\n\nPlease cite this data using:"
]
| [
"TAGS\n#language_creators-found #language_creators-machine-generated #multilinguality-multilingual #size_categories-1K<n<10K #size_categories-n<1K #language-English #language-French #language-Russian #language-Spanish #language-Croatian #language-Italian #language-Portuguese #language-Turkish #language-Polish #language-Lithuanian #language-German #language-Ukrainian #language-Hungarian #language-Dutch #language-Albanian #language-Romanian #language-Serbian #license-cc-by-sa-4.0 #legal #hearing #oral argument #transcript #echr #dialog #region-us \n",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: tbd",
"### Dataset Summary\n\nThis dataset contains transcribed court hearings sourced from official hearings of the __European Court of Human Rights__ (URL The hearings are 154 selected webcasts (videos) from 2012-2022 in their original language (no interpretation). With manual annotation for language labels and automatic processing of the extracted audio with pyannote and whisper-large-v2, the resulting dataset contains 4000 speaker turns and 88920 individual lines. The dataset contains two subsets, the transcripts and the metadata with linked documents. The transcripts are additionally available as .txt or .xml.",
"### Languages\n\nThe largest amounts in the transcripts are:\n\nEnglish, French\n\nA smaller portion also contains the following languages:\n\nRussian, Spanish, Croatian, Italian, Portuguese, Turkish, Polish, Lithuanian, German, Ukrainian, Hungarian, Dutch, Albanian, Romanian, Serbian\n\nThe collected metadata is:\n\nEnglish",
"## Dataset Structure",
"### Data Instances\n\nEach instance in transcripts represents an entire segment of a transcript, similar to a conversation turn in a dialog. \n\n\n\nEach instance in documents represents a information on a document in hudoc associated with a hearing and the metadata associated with a hearing. The actual document is linked and can also be found in hudoc with the case_id. Note: 'hearing_type' states the type of the hearing, 'type' states the type of the document. If the hearing is a \"Grand Chamber hearing\", the \"CHAMBER\" document refers to a different hearing.",
"### Data Fields\n\ntranscripts:\n\n* id: the identifier\n* webcast_id: the identifier for the hearing\n* segment_id: the identifier of the current speaker segment in the current hearing\n* speaker_name: the name of the speaker (not given for Applicant, Government or Third Party)\n* speaker_role: the role/party the speaker represents ('Announcer' for announcements, 'Judge' for judges, 'JudgeP' for judge president, 'Applicant' for representatives of the applicant, 'Government' for representatives of the respondent government, 'ThirdParty' for representatives of third party interveners)\n* data: sequence of the following fields\n * begin: the timestamp for begin of line (in seconds)\n * end: the timestamp for end of line (in seconds)\n * language: the language spoken (in ISO 639-1)\n * text: the spoken line\n\ndocuments:\n\n* id: the identifier\n* webcast_id: the identifier for the hearing (allows linking to transcripts)\n* hearing_title: the title of the hearing\n* hearing_date: the date of the hearing\n* hearing_type: the type of hearing (Grand Chamber, Chamber or Grand Chamber Judgment Hearing)\n* application_number: the application numbers which are associated with the hearing and case\n* case_id: the id of the case\n* case_name: the name of the case \n* case_url: the direct link to the document\n* ecli: the ECLI (European Case Law Identifier)\n* type: the type of the document\n* document_date: the date of the document\n* importance: the importance score of the case (1 is the highest importance, key case)\n* articles: the concerning articles of the Convention of Human Rights\n* respondent_government: the code of the respondent government(s) (in ISO-3166 Alpha-3)\n* issue: the references to the issue of the case\n* strasbourg_caselaw: the list of cases in the ECHR which are relevant to the current case\n* external_sources: the relevant references outside of the ECHR\n* conclusion: the short textual description of the conclusion\n* separate_opinion: the indicator if there is a separate opinion",
"### Data Splits\n\nThe dataset is only split into a train set.",
"## Dataset Creation",
"### Curation Rationale\n\nThis datasets provides partly corrected transcribed webcasts to enable the processing of hearings in legal NLP. No specific task is given.",
"### Source Data",
"#### Data Collection\n\nThe data was collected by transcribing the publicly available webcasts of the ECHR with the help of pyannote and whisper-large-v2. The documents were sourced from the ECHR hudoc database.",
"#### Who are the source producers?\n\nParticipants in hearings before the ECHR for the audio and video material. Employees and judges of the ECHR for the documents.",
"### Annotations",
"#### Annotation process\n\nlanguage identification Spoken languages were manually identified by research assistants. Disagreements were discussed to achieve the final language label.\n\ntranscript correction All parts spoken by Judge or Judge President are corrected for the languages English and French by research assistants with a high proficiency in the respective language.",
"#### Personal and Sensitive Information\n\nThe dataset contains names of judges and other participants in the hearings. Due to those names being available in the public court material, we did not remove them. The machine-generated transcripts may also contain names, which were neither checked nor removed. In case of sensitive information, we rely on the provided material to provide protection (occasionally bleeping out names which should not have been mentioned in webcasts, appropriate anonymization in the documents).",
"## Additional Information\n\nDownload the transcripts and linked documents:\n\n\n\nFormatted versions of the transcripts in .txt and .xml and more information on the collection and creation can be found on github.\n\n\n\n\nPlease cite this data using:"
]
|
83bceb37fbfd226e3c2406c1d7a71bdb2801f23c | # Dataset Card for "quirky_sciq_pythia-410m"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | atmallen/quirky_sciq_pythia-410m | [
"region:us"
]
| 2023-12-07T08:18:54+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "label", "dtype": "int64"}, {"name": "difficulty", "dtype": "float64"}, {"name": "statement", "dtype": "string"}, {"name": "character", "dtype": "string"}, {"name": "alice_label", "dtype": "bool"}, {"name": "bob_label", "dtype": "bool"}, {"name": "bob_log_odds", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 29103976, "num_examples": 46716}, {"name": "validation", "num_bytes": 2464470, "num_examples": 4000}, {"name": "test", "num_bytes": 2510666, "num_examples": 4000}], "download_size": 7307630, "dataset_size": 34079112}} | 2023-12-12T06:54:01+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "quirky_sciq_pythia-410m"
More Information needed | [
"# Dataset Card for \"quirky_sciq_pythia-410m\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"quirky_sciq_pythia-410m\"\n\nMore Information needed"
]
|
067d9b87ee4f3c39f8dc2cf524a073ee6e0df514 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | SanaFaraz/history | [
"region:us"
]
| 2023-12-07T08:20:45+00:00 | {} | 2023-12-07T10:31:16+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
|
c919f35b4131a1d2fd20535e7e034168ea293e31 |
# Dataset of Ahagon Umiko
This is the dataset of Ahagon Umiko, containing 223 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 223 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 519 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 561 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 223 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 223 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 223 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 519 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 519 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 459 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 561 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 561 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| CyberHarem/ahagon_umiko_newgame | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
]
| 2023-12-07T08:23:31+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-07T08:32:56+00:00 | []
| []
| TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Ahagon Umiko
=======================
This is the dataset of Ahagon Umiko, containing 223 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
| []
| [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
]
|
555e0e55ab64ede628231dfbb3ae975809700a6d |
# jaqket_v1_qa_wikija_context
This is a dataset that adds Wikipedia context to the [AI王 official distribution dataset (JAQKET)](https://sites.google.com/view/project-aio/dataset?authuser=0).

It was built as a HuggingFace Dataset from the data published in JAQKET under the [CC BY-SA 4.0 DEED](https://creativecommons.org/licenses/by-sa/4.0/deed.ja) license, with the Wikipedia passages that contain each question's answer added as context. Intended uses include training Q&A models with added context, and checking whether an answer can be correctly extracted from a question plus context (i.e. as a reader for RAG).

The `context` field of each example stores the top-3 related Wikipedia passages (the three passages most similar to the question after converting it into a sentence-embedding vector), and the answer string is contained somewhere in one of these top-3 passages. Examples whose top-3 passages did not contain the answer have been removed. The data is split into 2,939 train and 980 validation examples.

The Wikipedia passages come from [passages-c400-jawiki-20230403 of singletongue/wikipedia-utils](https://huggingface.co/datasets/singletongue/wikipedia-utils/viewer/passages-c400-jawiki-20230403), and the passages related to each question are retrieved by similarity vector search using `multilingual-e5-large-query` from [hotchpotch/wikipedia-passages-jawiki-embeddings](https://huggingface.co/datasets/hotchpotch/wikipedia-passages-jawiki-embeddings).
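As a quick sanity check on that structure, here is a minimal sketch using the field names `question`, `answer`, and `context` listed in this card's metadata (the split names `train` and `validation` are taken from the split info above):

```
from datasets import load_dataset

# Load the validation split of this dataset.
ds = load_dataset("hotchpotch/jaqket_v1_qa_wikija_context", split="validation")

# Every example should contain its answer string in one of the top-3 context passages.
missing = 0
for example in ds:
    contexts = example["context"]  # list of up to 3 Wikipedia passages
    if not any(example["answer"] in passage for passage in contexts):
        missing += 1

print(f"examples without the answer in their context: {missing} / {len(ds)}")
```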
## Dataset creation script

- https://huggingface.co/datasets/hotchpotch/jaqket_v1_qa_wikija_context/blob/main/create_dataset.ipynb

## License
- [CC BY-SA 4.0 DEED](https://creativecommons.org/licenses/by-sa/4.0/deed.ja)
| hotchpotch/jaqket_v1_qa_wikija_context | [
"task_categories:question-answering",
"language:ja",
"license:cc-by-sa-4.0",
"region:us"
]
| 2023-12-07T08:36:30+00:00 | {"language": ["ja"], "license": "cc-by-sa-4.0", "task_categories": ["question-answering"], "dataset_info": {"features": [{"name": "qid", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "context", "sequence": "string"}, {"name": "answers", "sequence": "string"}, {"name": "competition", "dtype": "string"}, {"name": "timestamp", "dtype": "string"}, {"name": "section", "dtype": "string"}, {"name": "number", "dtype": "string"}, {"name": "original_question", "dtype": "string"}, {"name": "original_answer", "dtype": "string"}, {"name": "original_additional_info", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7981391, "num_examples": 2939}, {"name": "validation", "num_bytes": 2671680, "num_examples": 980}], "download_size": 6275956, "dataset_size": 10653071}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2023-12-10T23:26:35+00:00 | []
| [
"ja"
]
| TAGS
#task_categories-question-answering #language-Japanese #license-cc-by-sa-4.0 #region-us
|
# jaqket_v1_qa_wikija_context
This is a dataset that adds Wikipedia context to the AI王 official distribution dataset (JAQKET).

It was built as a HuggingFace Dataset from the data published in JAQKET under the CC BY-SA 4.0 DEED license, with the Wikipedia passages that contain each question's answer added as context. Intended uses include training Q&A models with added context, and checking whether an answer can be correctly extracted from a question plus context (i.e. as a reader for RAG).

The 'context' field of each example stores the top-3 related Wikipedia passages (the three passages most similar to the question after converting it into a sentence-embedding vector), and the answer string is contained somewhere in one of these top-3 passages. Examples whose top-3 passages did not contain the answer have been removed. The data is split into 2,939 train and 980 validation examples.

The Wikipedia passages come from passages-c400-jawiki-20230403 of singletongue/wikipedia-utils, and the passages related to each question are retrieved by similarity vector search using 'multilingual-e5-large-query' from hotchpotch/wikipedia-passages-jawiki-embeddings.

## Dataset creation script

- URL

## License

- CC BY-SA 4.0 DEED
| [
"# jaqket_v1_qa_wikija_context\n\nAI王 公式配布データセット(JAQKET)へ、Wikipedia のコンテキストを追加したデータセットです。\n\nJAQKET で公開されており、かつライセンスがCC BY-SA 4.0 DEEDのデータをもとに、質問の回答が含まれる Wikipedia の passage をコンテキスト追加し HuggingFace Dataset を作成した物です。Q&A タスクにコンテキスト入れて学習させたいときや、質問 + コンテキストから回答をきちんと抽出できるのか(RAGのreader用途)、といったタスクの利用が想定されます。\n\nデータセットの context には、wikipedia の関連 passage の Top-3 (質問を特徴文ベクトルへ変換したもの類似passage Top-3)が格納されていて、このTop-3のうちのどこかの文字列に answer が含まれています。Top-3にanswerが含まれなかったデータは、取り除いています。データはtrain 2939件、validation 980件に split しています。\n\nwikipedia passage には singletongue/wikipedia-utils の passages-c400-jawiki-20230403を、質問から関連する passage の取得には hotchpotch/wikipedia-passages-jawiki-embeddingsの 'multilingual-e5-large-query' を使って類似ベクトル検索し抽出しています。",
"## データセットの作成スクリプト\n\n- URL",
"## ライセンス\n\n- CC BY-SA 4.0 DEED"
]
| [
"TAGS\n#task_categories-question-answering #language-Japanese #license-cc-by-sa-4.0 #region-us \n",
"# jaqket_v1_qa_wikija_context\n\nAI王 公式配布データセット(JAQKET)へ、Wikipedia のコンテキストを追加したデータセットです。\n\nJAQKET で公開されており、かつライセンスがCC BY-SA 4.0 DEEDのデータをもとに、質問の回答が含まれる Wikipedia の passage をコンテキスト追加し HuggingFace Dataset を作成した物です。Q&A タスクにコンテキスト入れて学習させたいときや、質問 + コンテキストから回答をきちんと抽出できるのか(RAGのreader用途)、といったタスクの利用が想定されます。\n\nデータセットの context には、wikipedia の関連 passage の Top-3 (質問を特徴文ベクトルへ変換したもの類似passage Top-3)が格納されていて、このTop-3のうちのどこかの文字列に answer が含まれています。Top-3にanswerが含まれなかったデータは、取り除いています。データはtrain 2939件、validation 980件に split しています。\n\nwikipedia passage には singletongue/wikipedia-utils の passages-c400-jawiki-20230403を、質問から関連する passage の取得には hotchpotch/wikipedia-passages-jawiki-embeddingsの 'multilingual-e5-large-query' を使って類似ベクトル検索し抽出しています。",
"## データセットの作成スクリプト\n\n- URL",
"## ライセンス\n\n- CC BY-SA 4.0 DEED"
]
|
d44e83ba65cd7e1110bc7cfad36b8787a303f3bb | # Dataset Card for "kor_boolq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
@inproceedings{clark2019boolq,
title = {BoolQ: Exploring the Surprising Difficulty of Natural Yes/No Questions},
author = {Clark, Christopher and Lee, Kenton and Chang, Ming-Wei and Kwiatkowski, Tom and Collins, Michael and Toutanova, Kristina},
booktitle = {NAACL},
year = {2019},
}
``` | KETI-AIR/kor_boolq | [
"license:cc-by-sa-3.0",
"region:us"
]
| 2023-12-07T08:44:28+00:00 | {"license": "cc-by-sa-3.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "question", "dtype": "string"}, {"name": "passage", "dtype": "string"}, {"name": "answer", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 6533771, "num_examples": 9427}, {"name": "validation", "num_bytes": 2239144, "num_examples": 3270}], "download_size": 5399244, "dataset_size": 8772915}} | 2023-12-07T08:45:59+00:00 | []
| []
| TAGS
#license-cc-by-sa-3.0 #region-us
| # Dataset Card for "kor_boolq"
More Information needed
# Source Data Citation Information
| [
"# Dataset Card for \"kor_boolq\"\n\nMore Information needed",
"# Source Data Citation Information"
]
| [
"TAGS\n#license-cc-by-sa-3.0 #region-us \n",
"# Dataset Card for \"kor_boolq\"\n\nMore Information needed",
"# Source Data Citation Information"
]
|
fb1a49eebc8c75a3a7ea446c864a8166466c921a | [https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EWH-bnNdqkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EWH-bnNdqkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ewmDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ewmDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ewPAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ewPAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ewQEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ewQEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eWrDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eWrDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EWwBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EWwBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ewXAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ewXAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eX7DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eX7DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1exLDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1exLDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ExTDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ExTDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eXzAQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eXzAQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eYbBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eYbBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eYDAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eYDAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eyMPWQjcCsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eyMPWQjcCsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eYqCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eYqCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eyYAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eyYAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eyyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eyyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eZ1DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eZ1DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eZ8TjQFcV8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1eZ8TjQFcV8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EZgBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EZgBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EZho2PBjsgC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EZho2PBjsgC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ezjAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ezjAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EzKN-NJRqEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EzKN-NJRqEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EZXAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EZXAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EZyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1EZyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f_BR2DulRIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f_BR2DulRIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f_RoDUKydgC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f_RoDUKydgC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-F019Q8dtAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-F019Q8dtAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F0hQbKmEIcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F0hQbKmEIcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f0QBwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f0QBwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F1DPwNcA8oC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F1DPwNcA8oC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F1oDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F1oDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f2bBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f2bBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f2ICgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f2ICgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F2rDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F2rDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f2RDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f2RDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f2TDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f2TDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F3BBwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F3BBwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F3yCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F3yCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1f4CwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1f4CwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f4mBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f4mBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F4ODQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F4ODQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F4TNyZjddkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F4TNyZjddkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f4xAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f4xAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F53CwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F53CwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f53kihTyRMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f53kihTyRMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F5-DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F5-DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f5ZDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f5ZDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1F6DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1F6DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f6rDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f6rDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f6SBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f6SBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f6tDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f6tDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F7dVvaKsicC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F7dVvaKsicC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f85EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f85EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F8BEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F8BEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F8hCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F8hCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F8UEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F8UEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F96uHJfJHIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F96uHJfJHIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f98yWQLwAIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f98yWQLwAIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F9c3QDjKv8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F9c3QDjKv8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F9fBwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1F9fBwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f9KDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f9KDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f9nBwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f9nBwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fAMST5p8OEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fAMST5p8OEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fb8CwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fb8CwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fb9DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fb9DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FBf5BbU-uUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FBf5BbU-uUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fBGDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fBGDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FBjCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FBjCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fBSDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fBSDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fBXAQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fBXAQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fBxDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fBxDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FCbBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FCbBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FCM9yxtoyEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FCM9yxtoyEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fCvyRFs87gC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fCvyRFs87gC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FdHBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FdHBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fDNDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fDNDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fduAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fduAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fe3DgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fe3DgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FE4DAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FE4DAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fefBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fefBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fenDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fenDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fEoBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fEoBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fEWEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fEWEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FFaDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FFaDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fFhAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fFhAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fFHDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fFHDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FFhzgtu-P8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FFhzgtu-P8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ffnp1tkN2gC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ffnp1tkN2gC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FfQDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FfQDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FFrDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FFrDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fFsj9eVk5QC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fFsj9eVk5QC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fg1EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fg1EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fGaDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fGaDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FGbPfdiT6YC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FGbPfdiT6YC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FgoDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FgoDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fGXBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fGXBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FgYckfdj1MC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FgYckfdj1MC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fh1DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fh1DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FhaDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FhaDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1FHAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1FHAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1FhDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1FhDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fhJEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fhJEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fHRCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fHRCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FhUDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FhUDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fhwF52CNd8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fhwF52CNd8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fIBAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fIBAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fi-CAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fi-CAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1FiDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1FiDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fivAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fivAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fIVD8Hf-e4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fIVD8Hf-e4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fIZAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fIZAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJ7GOb_ngwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJ7GOb_ngwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fJBBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fJBBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJCFQP1x7EC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJCFQP1x7EC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJEEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJEEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJGEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJGEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJHDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FJHDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FjVBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FjVBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fk7DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fk7DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FKeDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FKeDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FkiDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FkiDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FkKBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FkKBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fKORyxDUtEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fKORyxDUtEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fkPAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fkPAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fKtAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fKtAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f-lAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1f-lAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FlnDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FlnDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1flSDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1flSDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FMaCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FMaCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fMlCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fMlCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FMqAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FMqAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fN5AgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fN5AgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fN9ynhG-tkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fN9ynhG-tkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FNaDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FNaDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FnbDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FnbDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1fNDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1fNDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fnEDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fnEDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FniAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FniAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FnJBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FnJBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FnTLtFPcU4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FnTLtFPcU4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fo8DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fo8DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FoEEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FoEEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fp_AQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fp_AQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fp0DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fp0DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fp1DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fp1DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fp6DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fp6DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fpdDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fpdDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FpFlWcr9DwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FpFlWcr9DwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FPHBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FPHBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fpmDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fpmDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FpnDOtIB9sC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FpnDOtIB9sC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FqCCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FqCCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fq-DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fq-DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1fQDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1fQDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1FQILRGEo0C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1FQILRGEo0C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fqsd-v4LcwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fqsd-v4LcwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fqUAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fqUAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fRADwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fRADwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FRBDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FRBDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-FrCQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-FrCQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FRFxRlNtOIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FRFxRlNtOIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1frKDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1frKDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1frQAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1frQAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fRtAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fRtAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fRtAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fRtAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FRYCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FRYCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fs3DAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fs3DAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fs8BAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fs8BAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-FSDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-FSDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FsIEWLys_sC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FsIEWLys_sC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FSxhKkHHe4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FSxhKkHHe4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ft2DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ft2DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ft4DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ft4DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fU4DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fU4DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fu7AAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fu7AAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fUaAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fUaAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FUCCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FUCCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fusDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fusDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fUtDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fUtDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fv5DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fv5DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fVADwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fVADwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fvhBwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fvhBwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FVLv7WS4C4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FVLv7WS4C4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FvW7ywmrPMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FvW7ywmrPMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fWkDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fWkDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FwkDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FwkDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FW-R-ePvwMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FW-R-ePvwMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FXBDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FXBDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FXRBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FXRBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fxzAzNE8wIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fxzAzNE8wIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FXzb8hwrEsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FXzb8hwrEsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fXZBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fXZBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FxzDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FxzDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FY_EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FY_EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fy8DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Fy8DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fY9DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fY9DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fyIBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fyIBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fynBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fynBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fyXCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fyXCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FyYDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FyYDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FzPDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1FzPDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fZyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1fZyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g_cDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g_cDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g_jDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g_jDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g_Z6YpHqlEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g_Z6YpHqlEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1g0EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1g0EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G15CgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G15CgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g1mDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g1mDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G1MHj2Wg1kC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G1MHj2Wg1kC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G1R5J6pTtUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G1R5J6pTtUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G2CDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G2CDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G2kHjIg9JMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G2kHjIg9JMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G3NRy4_8T4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G3NRy4_8T4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G3oCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G3oCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G4FlXMOgfkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G4FlXMOgfkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G5ODAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G5ODAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G5RDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G5RDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G5uDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G5uDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G63CgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G63CgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-G6DAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-G6DAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g6GDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g6GDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G6gDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G6gDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G6oBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G6oBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G6tCAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G6tCAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G70DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G70DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g7RAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g7RAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g8FEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g8FEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g8LAYGyUGUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g8LAYGyUGUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g8nCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g8nCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G8stZAS3cAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G8stZAS3cAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g93DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g93DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g9MCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g9MCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G9ODwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G9ODwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GA6DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GA6DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gADDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gADDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GAMCAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GAMCAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gaqDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gaqDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GAtDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GAtDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gaUDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gaUDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GaZBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GaZBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GB4BwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GB4BwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gB6MAiNKEkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gB6MAiNKEkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gBlpEdwYCUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gBlpEdwYCUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GbmBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GbmBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GBoAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GBoAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GBoDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GBoDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GbQCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GbQCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GbTDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GbTDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gbu5Dc3wmAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gbu5Dc3wmAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gBWDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gBWDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gcaG_xk3woC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gcaG_xk3woC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gCCBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gCCBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G-CDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1G-CDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GclDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GclDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gCMDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gCMDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gcYCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gcYCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gDElomlm3kC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gDElomlm3kC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gdNBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gdNBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GdQDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GdQDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gdyBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gdyBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GE6YbntJrQC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GE6YbntJrQC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gE8Rk_RI_8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gE8Rk_RI_8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GeDDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GeDDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GeHO2uy1XQC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GeHO2uy1XQC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GEoDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GEoDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GETAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GETAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GEWAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GEWAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GF3aYLH3z4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GF3aYLH3z4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gF7DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gF7DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GFd3vMxpk4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GFd3vMxpk4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gFiDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gFiDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GfMBqueWZwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GfMBqueWZwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gfMDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gfMDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-ggDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-ggDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gGFCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gGFCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GgIDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GgIDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gGODgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gGODgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gguCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gguCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ggxCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ggxCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gH6DQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gH6DQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ghAEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ghAEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1GHCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1GHCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-GHHrwodBsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-GHHrwodBsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gHMCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gHMCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GhoDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GhoDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gJ2CgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gJ2CgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GJdyuj8yX0C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GJdyuj8yX0C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gjFa-Nl4FoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gjFa-Nl4FoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GJgBwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GJgBwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GjIBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GjIBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GJjDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GJjDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GjqAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GjqAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gjVBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gjVBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gJXDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gJXDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GK5DAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GK5DAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GKfDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GKfDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GkMNv7PM68C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GkMNv7PM68C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GKsDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GKsDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gkTBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gkTBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GKuDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GKuDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gKUpITuf00C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gKUpITuf00C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GlaDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GlaDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GLwDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GLwDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gmaSxQmzbkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gmaSxQmzbkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GmpCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GmpCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gMQaX65aucC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gMQaX65aucC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gN9BAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gN9BAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gNCDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gNCDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GNGDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GNGDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gNODQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gNODQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GnRCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GnRCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GnTDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GnTDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GOQMnnrcUMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GOQMnnrcUMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gPADgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gPADgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gpBCfzy9p8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gpBCfzy9p8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g-pBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1g-pBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GpLXIfB5jcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GpLXIfB5jcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gpmDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gpmDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gpOBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gpOBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gppDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gppDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gpxSo0iHAoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gpxSo0iHAoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gPyBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gPyBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GPYDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GPYDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Gq4Ktxz2yIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Gq4Ktxz2yIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GQvCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GQvCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gqwDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gqwDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GR_DQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GR_DQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GrBDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GrBDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gReCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gReCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GrPIYG--i0C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GrPIYG--i0C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GsDEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GsDEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GSLGnxoygUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GSLGnxoygUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gsUeCcA7qMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gsUeCcA7qMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GtECQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GtECQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GtgBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GtgBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gthAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gthAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GtsCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GtsCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GtUSftY9y4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GtUSftY9y4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GU5BgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GU5BgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1guaqYHu_j4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1guaqYHu_j4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gueFWR7qjoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gueFWR7qjoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1guEHL5TAUEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1guEHL5TAUEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GUlLAowIT0C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GUlLAowIT0C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GunBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GunBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GUrEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GUrEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gUtCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gUtCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GuyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GuyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GV2DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GV2DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GV9BAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GV9BAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gV9DAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gV9DAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gvJDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gvJDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GVSAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GVSAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Gw9DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Gw9DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GWBC1mqIlwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GWBC1mqIlwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GWiztQvOQYC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GWiztQvOQYC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gwkDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gwkDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GWQqInAqPoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GWQqInAqPoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gwtDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gwtDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GwwDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GwwDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Gx8DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Gx8DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GXDBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GXDBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GXG3jkjq-wC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GXG3jkjq-wC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gxrDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gxrDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Gy_BgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Gy_BgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GyCDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GyCDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gYcukAkvnoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gYcukAkvnoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gyDBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gyDBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gyhRLaTCd4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gyhRLaTCd4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gYPBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gYPBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gySCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gySCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GYtDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GYtDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GyzDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GyzDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gZMDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1gZMDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GZqDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1GZqDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h03DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h03DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H0DCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H0DCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H0kEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H0kEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H10DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H10DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H11DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H11DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H1aAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H1aAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h1BDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h1BDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h1IBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h1IBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H1pbLuPG9YC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H1pbLuPG9YC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H1qDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H1qDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h1tDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h1tDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H1yCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H1yCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1H2CwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1H2CwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H2gDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H2gDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h2nCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h2nCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h2qDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h2qDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h3yDvDlU1oC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h3yDvDlU1oC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H4_EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H4_EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h4HCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h4HCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h4KDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h4KDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h4xDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h4xDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h4yDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h4yDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H5eKoyjZKwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H5eKoyjZKwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H5GEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H5GEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h5iDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h5iDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H5Tc8gcmZIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H5Tc8gcmZIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H6DDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H6DDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h71O9rpCIcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h71O9rpCIcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H7DBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H7DBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h7vHLEBudcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h7vHLEBudcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h83DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h83DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h8EAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h8EAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H8GCAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H8GCAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h8LBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h8LBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h8NE0iud1IC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h8NE0iud1IC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h8sEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h8sEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h98XfHJ2OAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1h98XfHJ2OAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H9coaAWKT8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H9coaAWKT8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H9yDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H9yDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hA1qOTY4D8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hA1qOTY4D8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HaCDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HaCDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hAhDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hAhDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hAPhpgsSQsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hAPhpgsSQsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HAzCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HAzCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hbAavG-aLEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hbAavG-aLEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hBADwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hBADwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hBGDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hBGDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HBhDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HBhDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HbHDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HbHDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HBoGKaP5msC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HBoGKaP5msC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HbpaOBEpvkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HbpaOBEpvkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HbPDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HbPDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HBPDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HBPDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1hCDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1hCDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hcpDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hcpDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HcZEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HcZEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HdeBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HdeBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HdTDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HdTDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1He9DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1He9DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HehDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HehDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HelAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HelAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hEqDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hEqDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hFbDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hFbDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HFJEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HFJEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hfpd34NpJcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hfpd34NpJcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hFQDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hFQDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HfTDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HfTDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hFZDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hFZDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hGBDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hGBDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HGcDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HGcDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hgwDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hgwDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hgYBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hgYBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HH-DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HH-DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1hHEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1hHEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hheDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hheDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HhGDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HhGDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hHODQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hHODQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hhPDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hhPDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hhVDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hhVDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hI-BXDyAMYC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hI-BXDyAMYC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hIFm8jAmlgC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hIFm8jAmlgC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HiJCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HiJCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hInEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hInEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HisDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HisDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HIuQNl72_EC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HIuQNl72_EC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjcDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjcDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjpBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjpBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjpDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjpDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hJTDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hJTDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HJvAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HJvAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjvBwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjvBwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjWCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HjWCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hKLDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hKLDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HKLk91MRBUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HKLk91MRBUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hL0BQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hL0BQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hlADwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hlADwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hLGAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hLGAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hlTBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hlTBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HlXRR6EBwwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HlXRR6EBwwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hMEAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hMEAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HmGDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HmGDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HmIDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HmIDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hMJGEJTIqYC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hMJGEJTIqYC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hMnDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hMnDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hMyBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hMyBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hN_BwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hN_BwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hn_DAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hn_DAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HN4DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HN4DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HN7CgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HN7CgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HNgQ2mSjicC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HNgQ2mSjicC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ho3CwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ho3CwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hoABQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hoABQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HOgCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HOgCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hOICgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hOICgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hOjvwDHRVUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hOjvwDHRVUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hORAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hORAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1houEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1houEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Hp7DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Hp7DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HpfBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HpfBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hpFhxhBLngC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hpFhxhBLngC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HpyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HpyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hQgBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hQgBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hqOAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hqOAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-HRBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-HRBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hRcDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hRcDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hRHBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hRHBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HrszHmRYJcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HrszHmRYJcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hS0BAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hS0BAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hs1GWwnx-wC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hs1GWwnx-wC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HSaRn1itAoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HSaRn1itAoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hSnDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hSnDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hsoCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hsoCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HsVNy6Y2dwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HsVNy6Y2dwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hSXYdDCrG4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hSXYdDCrG4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hsZEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hsZEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HTrAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HTrAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HU5DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HU5DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1huQN-7MfQAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1huQN-7MfQAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HwO3EBoKaQC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HwO3EBoKaQC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hwQBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hwQBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hwXykRIogQC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hwXykRIogQC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Hx4CAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Hx4CAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hXfCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hXfCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hXKAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hXKAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HxSDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HxSDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hxUBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hxUBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HxyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HxyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HxzueGIY5QC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HxzueGIY5QC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H-YBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1H-YBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hYmDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hYmDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hYqDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hYqDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HyyVeePtdIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HyyVeePtdIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hyzCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hyzCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HzFCQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HzFCQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HzICgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HzICgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HZIogBYr78C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HZIogBYr78C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hZjDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1hZjDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HzLDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1HzLDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i_ABQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i_ABQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i_HePjV4bIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i_HePjV4bIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I_PDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I_PDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I0gEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I0gEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I0HEqdrX8kC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I0HEqdrX8kC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i0Vysk4saQC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i0Vysk4saQC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I1xDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I1xDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I20Jhs-yNQC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I20Jhs-yNQC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i22BgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i22BgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i22DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i22DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i2dCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i2dCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i2gCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i2gCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I2pBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I2pBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i4EWAG7j24C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i4EWAG7j24C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I57uPGVY04C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I57uPGVY04C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i59DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i59DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I5i7E_pLpgC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I5i7E_pLpgC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I5oDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I5oDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I62AgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I62AgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I65AwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I65AwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i6aDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i6aDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I6HbvLNfhIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I6HbvLNfhIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I6iK79f5d8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I6iK79f5d8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i7eCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i7eCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i8ZCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i8ZCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i9LEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i9LEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I9mBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I9mBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I9ruEpcPd8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1I9ruEpcPd8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iAhAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iAhAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IArDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IArDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IAXBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IAXBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IaXDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IaXDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iB3PF6Pa8kC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iB3PF6Pa8kC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IBjDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IBjDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IBRODEFd40C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IBRODEFd40C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IbSBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IbSBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ibYDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ibYDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iCB3EG-3UYC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iCB3EG-3UYC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1icCQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1icCQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ICeDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ICeDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ICoAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ICoAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ICwX1uWWhoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ICwX1uWWhoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1id_DQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1id_DQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ideAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ideAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1-IDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1-IDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ie6CAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ie6CAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iE9DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iE9DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IeeDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IeeDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iEqroaZamwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iEqroaZamwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iFgp4wBe6kC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iFgp4wBe6kC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IFSAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IFSAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1igDDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1igDDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iGHBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iGHBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IGJDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IGJDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IGkeU6EOIAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IGkeU6EOIAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1igREAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1igREAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IgtEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IgtEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1igUBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1igUBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iGZ2ugjiSYC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iGZ2ugjiSYC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ih9BAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ih9BAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1IHRDgn07gC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1IHRDgn07gC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i-HVKoZK7wC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1i-HVKoZK7wC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ihZDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ihZDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1II6DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1II6DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iI6EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iI6EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IiBDLZPv28C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IiBDLZPv28C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IiJbF8m73YC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IiJbF8m73YC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IIkBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IIkBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IiqCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IiqCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ij-CgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ij-CgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IjF7fkpDsgC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IjF7fkpDsgC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1ijGtc64RcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1ijGtc64RcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IJHDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IJHDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-IkDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-IkDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iKgBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iKgBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IkgEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IkgEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IkxDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IkxDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1il0BQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1il0BQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IleAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IleAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ilGNuXxXzcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ilGNuXxXzcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IlnDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IlnDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iluAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iluAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IlUCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IlUCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iM-gD2aChwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iM-gD2aChwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iMHTv4MGyAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iMHTv4MGyAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iN0DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iN0DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1INavY_UlMsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1INavY_UlMsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iNEAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iNEAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Innj8lGcPAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Innj8lGcPAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1inZDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1inZDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1io2DQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1io2DQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Io8DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Io8DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IODDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IODDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Io-EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Io-EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IoHDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IoHDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iOiCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iOiCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IokeuC4_hMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IokeuC4_hMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iozAAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iozAAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ip3CgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ip3CgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iPCBwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iPCBwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iPNDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iPNDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iPodSYzWoAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iPodSYzWoAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ipx1iP-gigC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ipx1iP-gigC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IqhCEVlXwMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IqhCEVlXwMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iQZBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iQZBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IR6MKWL8QkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IR6MKWL8QkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IR8DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IR8DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IrKBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IrKBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IRNNvtYUp8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IRNNvtYUp8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1irsDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1irsDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iRwDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iRwDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ISaDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ISaDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iSgnUXqPSoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iSgnUXqPSoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ishCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ishCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IskEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IskEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IstEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IstEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IsuDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IsuDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1It1DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1It1DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iudDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iudDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IUlHRhWXCsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IUlHRhWXCsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iuNNKIJWjgC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iuNNKIJWjgC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iUrDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iUrDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iuZDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iuZDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iVA4cUmBCUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iVA4cUmBCUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IVewDBwhAAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IVewDBwhAAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iVkDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iVkDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IvPDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IvPDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IvUDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IvUDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iVyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iVyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iw4DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iw4DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iWFDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iWFDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IWmCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IWmCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IWT6WCV9PIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IWT6WCV9PIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Iwtg1knCo0C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Iwtg1knCo0C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ixf64lyflYC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ixf64lyflYC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IxlBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IxlBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IXLCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IXLCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iXVCQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iXVCQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IylDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IylDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iyUAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iyUAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IYZBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IYZBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IZ4CwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IZ4CwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IZ4DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IZ4DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1IZAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1IZAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IzNDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IzNDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iZOBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iZOBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iZqAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1iZqAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1izr3Cwhtz8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1izr3Cwhtz8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1izUDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1izUDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1izVDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1izVDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IZWDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IZWDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IZzJ7lDxU4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1IZzJ7lDxU4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j_9AwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j_9AwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j_BDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j_BDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j_FDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j_FDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J_jCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J_jCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J_wDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J_wDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J0aCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J0aCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j0nCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j0nCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J0uDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J0uDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j1fBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j1fBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J1mL7ScoxkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J1mL7ScoxkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J1SBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J1SBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J20DAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J20DAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1j5AwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1j5AwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j64DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j64DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j6dBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j6dBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J6oDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J6oDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j6PDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j6PDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J78hiHKaPoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J78hiHKaPoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J7H57gFmS4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J7H57gFmS4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j7oInup9-8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j7oInup9-8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1J8Bn2-k00C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1J8Bn2-k00C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J8lDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J8lDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J8uefCJutwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J8uefCJutwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j8vG-xV7DkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j8vG-xV7DkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J8wDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1J8wDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jadDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jadDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JAmDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JAmDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jatAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jatAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JaTCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JaTCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JB8AwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JB8AwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jBJEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jBJEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jBkAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jBkAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JBoT20y0LkC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JBoT20y0LkC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JbtAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JbtAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jb-tboIelcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jb-tboIelcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JBxDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JBxDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jc9DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jc9DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jCGAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jCGAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JcI39Kzf1UC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JcI39Kzf1UC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JcYCJgKdGoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JcYCJgKdGoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jDDg9YnYrgC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jDDg9YnYrgC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jdjkCTcbBAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jdjkCTcbBAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JdsX3CJOK4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JdsX3CJOK4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Je_BAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Je_BAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jE-DdXn6rcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jE-DdXn6rcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JEF321oI_UC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JEF321oI_UC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jEhEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jEhEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jenBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jenBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jEOAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jEOAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JeQDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JeQDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JF2BQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JF2BQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jfFDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jfFDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jfnDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jfnDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JFqDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JFqDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jfrAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jfrAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jFuDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jFuDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jG3CgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jG3CgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jg3EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jg3EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JGgDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JGgDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jgI5s9bOmsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jgI5s9bOmsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jglDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jglDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jH0vyLy08cC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jH0vyLy08cC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jh4dkcw0nsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jh4dkcw0nsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jhGYTfxjSYC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jhGYTfxjSYC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jhRBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jhRBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JhvBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JhvBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JHVDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JHVDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jHZCQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jHZCQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ji2DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ji2DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jI5DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jI5DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JI7DgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JI7DgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jIfDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jIfDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JImkcg85yYC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JImkcg85yYC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JIvEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JIvEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JJADwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JJADwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JjSAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JjSAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jJuAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jJuAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JKnDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JKnDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JkQAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JkQAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jKxDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jKxDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jl_CwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jl_CwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jl7DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jl7DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JlaAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JlaAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JLFDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JLFDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jmN8FRwfiUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jmN8FRwfiUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jNhDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jNhDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JnJDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JnJDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jNlDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jNlDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jnvDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jnvDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jNYCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jNYCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jNyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jNyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JOEDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JOEDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jOlDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jOlDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jOPVVAFNeEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jOPVVAFNeEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jOstAAG1n4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jOstAAG1n4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jphBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jphBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jPh-EWwapMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jPh-EWwapMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JPNBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JPNBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JPpDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JPpDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JpRBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JpRBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jQ0AwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jQ0AwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jq1jhZxaHMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jq1jhZxaHMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jq3BZDahV8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jq3BZDahV8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JQfQxzXCnsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JQfQxzXCnsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JQpEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JQpEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JQSRLPnqBwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JQSRLPnqBwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jQtDQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jQtDQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JQVv-7HYKUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JQVv-7HYKUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jRhDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jRhDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jRJCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jRJCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JrlpzwhmV4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JrlpzwhmV4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jRQ9C9t6-MC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jRQ9C9t6-MC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JRqDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JRqDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1js2DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1js2DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JSaUBlh0GgC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JSaUBlh0GgC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JsdAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JsdAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JT4CwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JT4CwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jt4DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jt4DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jt5BgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jt5BgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jt8DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jt8DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JTFDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JTFDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jtJEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jtJEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jtNDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jtNDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jTpAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jTpAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jtqDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jtqDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JtXay9Kj90C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JtXay9Kj90C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ju4DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ju4DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jUbCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jUbCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j-UBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j-UBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JucDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JucDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1juCfX037TEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1juCfX037TEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jUDAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jUDAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jUoAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jUoAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1juuCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1juuCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1juWE6y1C1QC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1juWE6y1C1QC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jv9DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jv9DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JvJDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JvJDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jw43DhuMc8C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jw43DhuMc8C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jW5AgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jW5AgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jx5DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Jx5DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jxbCwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jxbCwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JxdDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JxdDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1jXo9hKud0C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1jXo9hKud0C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JXPDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JXPDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JXSDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JXSDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JxYAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JxYAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j-y_Lv_1MAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1j-y_Lv_1MAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jY9DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jY9DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jyu3Z9AYMQC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jyu3Z9AYMQC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JyyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JyyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JZ6IjsyxSoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JZ6IjsyxSoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jZ7CgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jZ7CgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jZFDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jZFDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JZGEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JZGEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jzHCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jzHCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JZJCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JZJCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JZO-CTRCnIC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1JZO-CTRCnIC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jzqDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jzqDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jzWDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1jzWDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k_2AwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k_2AwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K_CDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K_CDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k_zAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k_zAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K_ZCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K_ZCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K0HEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K0HEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k0qDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k0qDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k0UBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k0UBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K0xf6MpFV0C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K0xf6MpFV0C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K1EPUqg9toC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K1EPUqg9toC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k1HX2nYhjcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k1HX2nYhjcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K1iDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K1iDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K1UDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K1UDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k1VDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k1VDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K3MCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K3MCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K3vVZVQxXMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K3vVZVQxXMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K44DgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K44DgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K44EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K44EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k4gDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k4gDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K5cDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K5cDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K5SDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K5SDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K5XDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K5XDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K5yDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K5yDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K6CDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K6CDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k6DCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k6DCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K6GAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K6GAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k-6ksl3uLsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k-6ksl3uLsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k6rPh0BdxoC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k6rPh0BdxoC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K6VuErpS28C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K6VuErpS28C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K8rDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K8rDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K8tAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K8tAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k97DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k97DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k9ABAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1k9ABAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K9exKdEECEC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K9exKdEECEC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K9JDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1K9JDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KA_EAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KA_EAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kA1DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kA1DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KAbCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KAbCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KAFEAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KAFEAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KAGAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KAGAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kaXDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kaXDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KayoVl3OTMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KayoVl3OTMC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kBgBAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kBgBAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kbOAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kbOAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kBoDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kBoDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kc5DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kc5DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Kcf1jVgk8cC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Kcf1jVgk8cC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KcnBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KcnBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kcREAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kcREAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KcSMrGudGcC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KcSMrGudGcC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kCXCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kCXCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KDcznKsfC4C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KDcznKsfC4C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KdDAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KdDAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kdTDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kdTDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KDUCgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KDUCgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kE4DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kE4DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ke5DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ke5DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1keEBwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1keEBwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KEiQA1AJI0C](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KEiQA1AJI0C)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1keTDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1keTDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kFRDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kFRDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kg23OCxMLAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kg23OCxMLAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kg8S3LuzCQC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kg8S3LuzCQC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Kh0N6Xrj0UC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Kh0N6Xrj0UC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1khADwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1khADwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KHFCQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KHFCQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kHGBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kHGBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1khHAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1khHAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1khiDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1khiDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ki1MSGbVcsC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ki1MSGbVcsC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ki2AwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1ki2AwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kI-DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kI-DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kIW-L5DJDwC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kIW-L5DJDwC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kiXBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kiXBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-kjBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1-kjBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KJFZ2QH03oC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KJFZ2QH03oC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kJqAwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kJqAwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KjTBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KjTBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KkFDgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KkFDgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kkTAgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kkTAgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KKwBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KKwBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kkxDAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kkxDAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1kkXhuD8eAC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=-1kkXhuD8eAC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Kl7BAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Kl7BAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1klBiZqlqOUC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1klBiZqlqOUC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kMdBgAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kMdBgAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KmoCAAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KmoCAAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KN5DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KN5DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KnGLfQXlegC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KnGLfQXlegC)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KNXBQAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KNXBQAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KNyDwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KNyDwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KO1DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1KO1DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ko4DwAAQBAJ](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1Ko4DwAAQBAJ)
[https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kO5-kA-ztMC](https://script.google.com/macros/s/AKfycby-pLCUqD9OOlLM6VKhCDR4aOsrFyiCf7wqs7LLrUvnhLGKHlxY4ZabXnc0br2mQYGvjw/exec?id=1kO5-kA-ztMC) | jal7des/coba1 | [
"region:us"
]
| 2023-12-07T08:46:11+00:00 | {} | 2023-12-07T09:18:57+00:00 | []
| []
| TAGS
#region-us
| URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL | []
| [
"TAGS\n#region-us \n"
]
|
58a1f70f5054bf4f8012a0b6fffb02ae0a777785 |
# Dataset of Sakura Nene
This is the dataset of Sakura Nene, containing 278 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 278 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 633 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 698 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 278 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 278 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 278 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 633 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 633 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 568 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 698 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 698 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
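If you prefer to fetch one of these archives programmatically rather than through the links above, the `huggingface_hub` client can download files from this repository. The snippet below is a minimal sketch: the archive name comes from the table above, while the extraction directory is an arbitrary example.
```python
from zipfile import ZipFile

from huggingface_hub import hf_hub_download

# Download the 384x512 aligned archive from this dataset repository.
archive_path = hf_hub_download(
    repo_id="CyberHarem/sakura_nene_newgame",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)

# Unpack it into a local folder (the folder name is an arbitrary choice).
with ZipFile(archive_path) as archive:
    archive.extractall("sakura_nene_384x512")
```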
| CyberHarem/sakura_nene_newgame | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
]
| 2023-12-07T09:08:35+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-07T09:21:57+00:00 | []
| []
| TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Sakura Nene
======================
This is the dataset of Sakura Nene, containing 278 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
| []
| [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
]
|
9bc41bdb90325322bd1ff2896b653f6ab9bc49ed | # Dataset Card for "wsd_fr_wngt_semcor_translated_aligned_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/wsd_fr_wngt_semcor_translated_aligned_v2 | [
"region:us"
]
| 2023-12-07T09:45:27+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 533318950.09168905, "num_examples": 530592}, {"name": "test", "num_bytes": 2680706.9083109708, "num_examples": 2667}], "download_size": 127035571, "dataset_size": 535999657.0}} | 2023-12-07T09:49:31+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "wsd_fr_wngt_semcor_translated_aligned_v2"
More Information needed | [
"# Dataset Card for \"wsd_fr_wngt_semcor_translated_aligned_v2\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"wsd_fr_wngt_semcor_translated_aligned_v2\"\n\nMore Information needed"
]
|
db8562658dbf7850ccf7acf8014d382cf7cc5a60 |
# Dataset of Iijima Yun
This is the dataset of Iijima Yun, containing 330 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 330 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 777 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 863 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 330 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 330 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 330 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 777 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 777 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 615 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 863 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 863 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
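Once an archive has been downloaded and unpacked (for example with the `huggingface_hub` client), the images can be inspected locally. The sketch below assumes `dataset-384x512.zip` was extracted into a folder named `iijima_yun_384x512`; both that path and the `.png` extension are assumptions about your local layout and may need adjusting.
```python
from pathlib import Path

from PIL import Image

# Folder where dataset-384x512.zip was extracted (arbitrary example path).
root = Path("iijima_yun_384x512")

# Walk the extracted files and report each image's size.
for image_path in sorted(root.glob("**/*.png")):
    with Image.open(image_path) as image:
        print(image_path.name, image.size)
```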
| CyberHarem/iijima_yun_newgame | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
]
| 2023-12-07T09:58:04+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-07T10:11:17+00:00 | []
| []
| TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Iijima Yun
=====================
This is the dataset of Iijima Yun, containing 330 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
| []
| [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
]
|
c561ccb4fed33685d36d8660331e94b7857a5cf0 |
# Dataset Card for Common Voice Corpus 15
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Vaibhav Srivastav](mailto:[email protected])
### Dataset Summary
The Common Voice dataset consists of a unique MP3 and corresponding text file.
Many of the 28750 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 19159 validated hours in 114 languages, but more voices and languages are always added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Supported Tasks and Leaderboards
The results for models trained on the Common Voice datasets are available via the
[🤗 Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench)
### Languages
```
Abkhaz, Afrikaans, Albanian, Amharic, Arabic, Armenian, Assamese, Asturian, Azerbaijani, Basaa, Bashkir, Basque, Belarusian, Bengali, Breton, Bulgarian, Cantonese, Catalan, Central Kurdish, Chinese (China), Chinese (Hong Kong), Chinese (Taiwan), Chuvash, Czech, Danish, Dhivehi, Dioula, Dutch, English, Erzya, Esperanto, Estonian, Finnish, French, Frisian, Galician, Georgian, German, Greek, Guarani, Hakha Chin, Hausa, Hebrew, Hill Mari, Hindi, Hungarian, Icelandic, Igbo, Indonesian, Interlingua, Irish, Italian, Japanese, Kabyle, Kazakh, Kinyarwanda, Korean, Kurmanji Kurdish, Kyrgyz, Lao, Latvian, Lithuanian, Luganda, Macedonian, Malayalam, Maltese, Marathi, Meadow Mari, Moksha, Mongolian, Nepali, Norwegian Nynorsk, Occitan, Odia, Pashto, Persian, Polish, Portuguese, Punjabi, Quechua Chanka, Romanian, Romansh Sursilvan, Romansh Vallader, Russian, Sakha, Santali (Ol Chiki), Saraiki, Sardinian, Serbian, Slovak, Slovenian, Sorbian, Upper, Spanish, Swahili, Swedish, Taiwanese (Minnan), Tamazight, Tamil, Tatar, Thai, Tigre, Tigrinya, Toki Pona, Turkish, Turkmen, Twi, Ukrainian, Urdu, Uyghur, Uzbek, Vietnamese, Votic, Welsh, Yoruba
```
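If you need the machine-readable list of language configs rather than the names above, the `datasets` library can enumerate them. This is a small sketch; the exact set returned depends on your library version, and the gated dataset may require you to be logged in.
```python
from datasets import get_dataset_config_names

# List every language config available for Common Voice 15.
# You may need to authenticate first (e.g. `huggingface-cli login`).
configs = get_dataset_config_names("mozilla-foundation/common_voice_15_0")
print(len(configs), "language configs, e.g.", configs[:5])
```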
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi" for Hindi):
```python
from datasets import load_dataset
cv_15 = load_dataset("mozilla-foundation/common_voice_15_0", "hi", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
cv_15 = load_dataset("mozilla-foundation/common_voice_15_0", "hi", split="train", streaming=True)
print(next(iter(cv_15)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data.sampler import BatchSampler, RandomSampler
from torch.utils.data import DataLoader
cv_15 = load_dataset("mozilla-foundation/common_voice_15_0", "hi", split="train")
batch_sampler = BatchSampler(RandomSampler(cv_15), batch_size=32, drop_last=False)
dataloader = DataLoader(cv_15, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
cv_15 = load_dataset("mozilla-foundation/common_voice_15_0", "hi", split="train", streaming=True)
dataloader = DataLoader(cv_15, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 15 with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
'up_votes': 2,
'down_votes': 0,
'age': 'twenties',
'gender': 'male',
'accent': '',
'locale': 'et',
'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
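To illustrate the access pattern described above, the snippet below decodes a single sample and then casts the audio column to a different sampling rate. It is a sketch: the 16 kHz target is just a common choice for downstream speech models, not something the dataset requires, and loading this gated dataset may require authentication.
```python
from datasets import Audio, load_dataset

cv_15 = load_dataset("mozilla-foundation/common_voice_15_0", "hi", split="train")

# Query the sample index first, then the "audio" column, so only this clip is decoded.
sample = cv_15[0]["audio"]
print(sample["sampling_rate"], sample["array"].shape)

# Optionally resample on the fly, e.g. to 16 kHz for typical ASR models.
cv_15 = cv_15.cast_column("audio", Audio(sampling_rate=16000))
print(cv_15[0]["audio"]["sampling_rate"])
```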
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data is data that has been validated with reviewers and received upvotes that the data is of high quality.
The invalidated data is data has been invalidated by reviewers
and received downvotes indicating that the data is of low quality.
The reported data is data that has been reported, for different reasons.
The other data is data that has not yet been reviewed.
The dev, test, train are all data that has been reviewed, deemed of high quality and split into dev, test and train.
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them to practice.
Many examples in this dataset have trailing quotation marks, e.g. _“the cat sat on the mat.”_. These trailing quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer whether a sentence is a quotation or not a quotation from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset
ds = load_dataset("mozilla-foundation/common_voice_15_0", "en", use_auth_token=True)
def prepare_dataset(batch):
"""Function to preprocess the dataset with the .map method"""
transcription = batch["sentence"]
if transcription.startswith('"') and transcription.endswith('"'):
# we can remove trailing quotation marks as they do not affect the transcription
transcription = transcription[1:-1]
if transcription[-1] not in [".", "?", "!"]:
# append a full-stop to sentences that do not end in punctuation
transcription = transcription + "."
batch["sentence"] = transcription
return batch
ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
| mozilla-foundation/common_voice_15_0 | [
"task_categories:automatic-speech-recognition",
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"multilinguality:multilingual",
"source_datasets:extended|common_voice",
"license:cc0-1.0",
"arxiv:1912.06670",
"region:us"
]
| 2023-12-07T10:17:36+00:00 | {"annotations_creators": ["crowdsourced"], "language_creators": ["crowdsourced"], "license": ["cc0-1.0"], "multilinguality": ["multilingual"], "source_datasets": ["extended|common_voice"], "task_categories": ["automatic-speech-recognition"], "paperswithcode_id": "common-voice", "pretty_name": "Common Voice Corpus 15", "language_bcp47": ["ab", "af", "am", "ar", "as", "ast", "az", "ba", "bas", "be", "bg", "bn", "br", "ca", "ckb", "cnh", "cs", "cv", "cy", "da", "de", "dv", "dyu", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy-NL", "ga-IE", "gl", "gn", "ha", "he", "hi", "hsb", "hu", "hy-AM", "ia", "id", "ig", "is", "it", "ja", "ka", "kab", "kk", "kmr", "ko", "ky", "lg", "lo", "lt", "lv", "mdf", "mhr", "mk", "ml", "mn", "mr", "mrj", "mt", "myv", "nan-tw", "ne-NP", "nl", "nn-NO", "oc", "or", "pa-IN", "pl", "ps", "pt", "quy", "rm-sursilv", "rm-vallader", "ro", "ru", "rw", "sah", "sat", "sc", "sk", "skr", "sl", "sq", "sr", "sv-SE", "sw", "ta", "th", "ti", "tig", "tk", "tok", "tr", "tt", "tw", "ug", "uk", "ur", "uz", "vi", "vot", "yo", "yue", "zgh", "zh-CN", "zh-HK", "zh-TW"], "extra_gated_prompt": "By clicking on \u201cAccess repository\u201d below, you also agree to not attempt to determine the identity of speakers in the Common Voice dataset."} | 2023-12-07T18:20:14+00:00 | [
"1912.06670"
]
| []
| TAGS
#task_categories-automatic-speech-recognition #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-multilingual #source_datasets-extended|common_voice #license-cc0-1.0 #arxiv-1912.06670 #region-us
|
# Dataset Card for Common Voice Corpus 15
## Table of Contents
- Dataset Description
- Dataset Summary
- Supported Tasks and Leaderboards
- Languages
- Dataset Structure
- Data Instances
- Data Fields
- Data Splits
- Dataset Creation
- Curation Rationale
- Source Data
- Annotations
- Personal and Sensitive Information
- Considerations for Using the Data
- Social Impact of Dataset
- Discussion of Biases
- Other Known Limitations
- Additional Information
- Dataset Curators
- Licensing Information
- Citation Information
- Contributions
## Dataset Description
- Homepage: URL
- Repository: URL
- Paper: URL
- Leaderboard: URL
- Point of Contact: Vaibhav Srivastav
### Dataset Summary
The Common Voice dataset consists of a unique MP3 and corresponding text file.
Many of the 28750 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 19159 validated hours in 114 languages, but more voices and languages are always added.
Take a look at the Languages page to request a language or start contributing.
### Supported Tasks and Leaderboards
The results for models trained on the Common Voice datasets are available via the
Speech Bench
### Languages
## How to use
The 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load_dataset' function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi" for Hindi):
Using the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load_dataset' function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
*Bonus*: create a PyTorch dataloader directly with your own datasets (local/streamed).
### Local
### Streaming
To find out more about loading and preparing audio datasets, head over to URL
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 13 with 'transformers' - here.
## Dataset Structure
### Data Instances
A typical data point comprises the 'path' to the audio file and its 'sentence'.
Additional fields include 'accent', 'age', 'client_id', 'up_votes', 'down_votes', 'gender', 'locale' and 'segment'.
## Dataset Structure
### Data Instances
A typical data point comprises the 'path' to the audio file and its 'sentence'.
Additional fields include 'accent', 'age', 'client_id', 'up_votes', 'down_votes', 'gender', 'locale' and 'segment'.
### Data Fields
'client_id' ('string'): An id for which client (voice) made the recording
'path' ('string'): The path to the audio file
'audio' ('dict'): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0]["audio"]' the audio file is automatically decoded and resampled to 'dataset.features["audio"].sampling_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '"audio"' column, *i.e.* 'dataset[0]["audio"]' should always be preferred over 'dataset["audio"][0]'.
'sentence' ('string'): The sentence the user was prompted to speak
'up_votes' ('int64'): How many upvotes the audio file has received from reviewers
'down_votes' ('int64'): How many downvotes the audio file has received from reviewers
'age' ('string'): The age of the speaker (e.g. 'teens', 'twenties', 'fifties')
'gender' ('string'): The gender of the speaker
'accent' ('string'): Accent of the speaker
'locale' ('string'): The locale of the speaker
'segment' ('string'): Usually an empty field
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data is data that has been validated with reviewers and received upvotes that the data is of high quality.
The invalidated data is data has been invalidated by reviewers
and received downvotes indicating that the data is of low quality.
The reported data is data that has been reported, for different reasons.
The other data is data that has not yet been reviewed.
The dev, test, train are all data that has been reviewed, deemed of high quality and split into dev, test and train.
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them to practice.
Many examples in this dataset have trailing quotation marks, e.g. _“the cat sat on the mat.”_. These trailing quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer whether a sentence is a quotation or not a quotation from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, almost all sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
Public Domain, CC-0
| [
"# Dataset Card for Common Voice Corpus 15",
"## Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL\n- Leaderboard: URL\n- Point of Contact: Vaibhav Srivastav",
"### Dataset Summary\n\nThe Common Voice dataset consists of a unique MP3 and corresponding text file. \nMany of the 28750 recorded hours in the dataset also include demographic metadata like age, sex, and accent \nthat can help improve the accuracy of speech recognition engines.\n\nThe dataset currently consists of 19159 validated hours in 114 languages, but more voices and languages are always added. \nTake a look at the Languages page to request a language or start contributing.",
"### Supported Tasks and Leaderboards\n\nThe results for models trained on the Common Voice datasets are available via the \n Speech Bench",
"### Languages",
"## How to use\n\nThe 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load_dataset' function. \n\nFor example, to download the Hindi config, simply specify the corresponding language config name (i.e., \"hi\" for Hindi):\n\n\nUsing the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load_dataset' function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.\n\n\n*Bonus*: create a PyTorch dataloader directly with your own datasets (local/streamed).",
"### Local",
"### Streaming\n\n\n\nTo find out more about loading and preparing audio datasets, head over to URL",
"### Example scripts\n\nTrain your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 13 with 'transformers' - here.",
"## Dataset Structure",
"### Data Instances\n\nA typical data point comprises the 'path' to the audio file and its 'sentence'. \nAdditional fields include 'accent', 'age', 'client_id', 'up_votes', 'down_votes', 'gender', 'locale' and 'segment'.",
"## Dataset Structure",
"### Data Instances\n\nA typical data point comprises the 'path' to the audio file and its 'sentence'. \nAdditional fields include 'accent', 'age', 'client_id', 'up_votes', 'down_votes', 'gender', 'locale' and 'segment'.",
"### Data Fields\n\n'client_id' ('string'): An id for which client (voice) made the recording\n\n'path' ('string'): The path to the audio file\n\n'audio' ('dict'): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0][\"audio\"]' the audio file is automatically decoded and resampled to 'dataset.features[\"audio\"].sampling_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '\"audio\"' column, *i.e.* 'dataset[0][\"audio\"]' should always be preferred over 'dataset[\"audio\"][0]'.\n\n'sentence' ('string'): The sentence the user was prompted to speak\n\n'up_votes' ('int64'): How many upvotes the audio file has received from reviewers\n\n'down_votes' ('int64'): How many downvotes the audio file has received from reviewers\n\n'age' ('string'): The age of the speaker (e.g. 'teens', 'twenties', 'fifties')\n\n'gender' ('string'): The gender of the speaker\n\n'accent' ('string'): Accent of the speaker\n\n'locale' ('string'): The locale of the speaker\n\n'segment' ('string'): Usually an empty field",
"### Data Splits\n\nThe speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.\n\nThe validated data is data that has been validated with reviewers and received upvotes that the data is of high quality.\n\nThe invalidated data is data has been invalidated by reviewers\nand received downvotes indicating that the data is of low quality.\n\nThe reported data is data that has been reported, for different reasons.\n\nThe other data is data that has not yet been reviewed.\n\nThe dev, test, train are all data that has been reviewed, deemed of high quality and split into dev, test and train.",
"## Data Preprocessing Recommended by Hugging Face\n\nThe following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them to practice. \n\nMany examples in this dataset have trailing quotations marks, e.g _“the cat sat on the mat.“_. These trailing quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer whether a sentence is a quotation or not a quotation from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.\n\nIn addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, almost all sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\nThe dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThe dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information\n\nPublic Domain, CC-0"
]
| [
"TAGS\n#task_categories-automatic-speech-recognition #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-multilingual #source_datasets-extended|common_voice #license-cc0-1.0 #arxiv-1912.06670 #region-us \n",
"# Dataset Card for Common Voice Corpus 15",
"## Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL\n- Leaderboard: URL\n- Point of Contact: Vaibhav Srivastav",
"### Dataset Summary\n\nThe Common Voice dataset consists of a unique MP3 and corresponding text file. \nMany of the 28750 recorded hours in the dataset also include demographic metadata like age, sex, and accent \nthat can help improve the accuracy of speech recognition engines.\n\nThe dataset currently consists of 19159 validated hours in 114 languages, but more voices and languages are always added. \nTake a look at the Languages page to request a language or start contributing.",
"### Supported Tasks and Leaderboards\n\nThe results for models trained on the Common Voice datasets are available via the \n Speech Bench",
"### Languages",
"## How to use\n\nThe 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load_dataset' function. \n\nFor example, to download the Hindi config, simply specify the corresponding language config name (i.e., \"hi\" for Hindi):\n\n\nUsing the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load_dataset' function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.\n\n\n*Bonus*: create a PyTorch dataloader directly with your own datasets (local/streamed).",
"### Local",
"### Streaming\n\n\n\nTo find out more about loading and preparing audio datasets, head over to URL",
"### Example scripts\n\nTrain your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 13 with 'transformers' - here.",
"## Dataset Structure",
"### Data Instances\n\nA typical data point comprises the 'path' to the audio file and its 'sentence'. \nAdditional fields include 'accent', 'age', 'client_id', 'up_votes', 'down_votes', 'gender', 'locale' and 'segment'.",
"## Dataset Structure",
"### Data Instances\n\nA typical data point comprises the 'path' to the audio file and its 'sentence'. \nAdditional fields include 'accent', 'age', 'client_id', 'up_votes', 'down_votes', 'gender', 'locale' and 'segment'.",
"### Data Fields\n\n'client_id' ('string'): An id for which client (voice) made the recording\n\n'path' ('string'): The path to the audio file\n\n'audio' ('dict'): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0][\"audio\"]' the audio file is automatically decoded and resampled to 'dataset.features[\"audio\"].sampling_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '\"audio\"' column, *i.e.* 'dataset[0][\"audio\"]' should always be preferred over 'dataset[\"audio\"][0]'.\n\n'sentence' ('string'): The sentence the user was prompted to speak\n\n'up_votes' ('int64'): How many upvotes the audio file has received from reviewers\n\n'down_votes' ('int64'): How many downvotes the audio file has received from reviewers\n\n'age' ('string'): The age of the speaker (e.g. 'teens', 'twenties', 'fifties')\n\n'gender' ('string'): The gender of the speaker\n\n'accent' ('string'): Accent of the speaker\n\n'locale' ('string'): The locale of the speaker\n\n'segment' ('string'): Usually an empty field",
"### Data Splits\n\nThe speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.\n\nThe validated data is data that has been validated with reviewers and received upvotes that the data is of high quality.\n\nThe invalidated data is data has been invalidated by reviewers\nand received downvotes indicating that the data is of low quality.\n\nThe reported data is data that has been reported, for different reasons.\n\nThe other data is data that has not yet been reviewed.\n\nThe dev, test, train are all data that has been reviewed, deemed of high quality and split into dev, test and train.",
"## Data Preprocessing Recommended by Hugging Face\n\nThe following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them to practice. \n\nMany examples in this dataset have trailing quotations marks, e.g _“the cat sat on the mat.“_. These trailing quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer whether a sentence is a quotation or not a quotation from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.\n\nIn addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, almost all sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\nThe dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThe dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information\n\nPublic Domain, CC-0"
]
|
db99db494c7117f43b014bd7d22f5dabd0918ee9 | # Dataset Card for "ko_wiki_sentences_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | deokhk/ko_wiki_sentences_1000 | [
"region:us"
]
| 2023-12-07T10:45:05+00:00 | {"dataset_info": {"features": [{"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 178204, "num_examples": 1000}, {"name": "dev", "num_bytes": 161764, "num_examples": 1000}], "download_size": 208937, "dataset_size": 339968}} | 2023-12-07T10:45:14+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "ko_wiki_sentences_1000"
More Information needed | [
"# Dataset Card for \"ko_wiki_sentences_1000\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"ko_wiki_sentences_1000\"\n\nMore Information needed"
]
|
15ea94153756a12e405261b6593691cdf316839a |
# Dataset of Shinoda Hajime
This is the dataset of Shinoda Hajime, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 722 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 826 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 722 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 722 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 594 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 826 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 826 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| CyberHarem/shinoda_hajime_newgame | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
]
| 2023-12-07T10:48:48+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-07T10:59:39+00:00 | []
| []
| TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Shinoda Hajime
=========================
This is the dataset of Shinoda Hajime, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
| []
| [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
]
|
c68569c0906d26fc65bd831acee8ab0d74784590 |
# Dataset of Tooyama Rin
This is the dataset of Tooyama Rin, containing 265 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 265 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 610 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 660 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 265 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 265 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 265 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 610 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 610 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 523 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 660 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 660 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| CyberHarem/tooyama_rin_newgame | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
]
| 2023-12-07T11:31:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-07T11:41:03+00:00 | []
| []
| TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Tooyama Rin
======================
This is the dataset of Tooyama Rin, containing 265 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
| []
| [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
]
|
2ea4a3397ab46c9c680918e5f85b4e86ece180b8 |
Tatar-Russian parallel corpus.
```
@inproceedings{
title={Tatar parallel corpus},
author={Academy of Sciences of the Republic of Tatarstan, Institute of Applied Semiotics},
year={2023}
}
``` | AigizK/tatar-russian-parallel-corpora | [
"region:us"
]
| 2023-12-07T11:40:21+00:00 | {"dataset_info": {"features": [{"name": "tat", "dtype": "string"}, {"name": "rus", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 60268429, "num_examples": 161831}], "download_size": 29541597, "dataset_size": 60268429}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-07T11:49:34+00:00 | []
| []
| TAGS
#region-us
|
Татарско-русский параллельный корпус.
| []
| [
"TAGS\n#region-us \n"
]
|
6d0bad33c455ee302a2dde239980c294c7747e7f |
# Dataset of Hazuki Shizuku
This is the dataset of Hazuki Shizuku, containing 139 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 139 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 313 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 399 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 139 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 139 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 139 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 313 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 313 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 281 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 399 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 399 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| CyberHarem/hazuki_shizuku_newgame | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
]
| 2023-12-07T11:59:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-07T12:08:38+00:00 | []
| []
| TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Hazuki Shizuku
=========================
This is the dataset of Hazuki Shizuku, containing 139 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
| []
| [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
]
|
ec120daccdac1443ba79b37c594b8df0ef6bbc85 | This is a test dataset | arigos/diversevul | [
"license:unknown",
"region:us"
]
| 2023-12-07T12:33:11+00:00 | {"license": "unknown", "dataset_info": {"features": [{"name": "func", "dtype": "string"}, {"name": "target", "dtype": "int64"}, {"name": "cwe", "sequence": "string"}, {"name": "project", "dtype": "string"}, {"name": "commit_id", "dtype": "string"}, {"name": "hash", "dtype": "float64"}, {"name": "size", "dtype": "int64"}, {"name": "message", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 603840744.4099404, "num_examples": 297442}, {"name": "test", "num_bytes": 33547610.295029834, "num_examples": 16525}, {"name": "valid", "num_bytes": 33547610.295029834, "num_examples": 16525}], "download_size": 308920787, "dataset_size": 670935965.0000001}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}]} | 2023-12-07T14:18:27+00:00 | []
| []
| TAGS
#license-unknown #region-us
| This is a test dataset | []
| [
"TAGS\n#license-unknown #region-us \n"
]
|
abd5f5ed5ef7c9b5edc0bd4fb62665b982fd5470 | ### Description
This dataset is an instruct style dataset comprised of a scrape of the Yahoo! Answers website that was done in 2007. The dataset is comprised of 10 categories labeled 1-10. The categories are as follows:
1. Society & Culture
2. Science & Mathematics
3. Health
4. Education & Reference
5. Computers & Internet
6. Sports
7. Business & Finance
8. Entertainment & Music
9. Family & Relationships
10. Politics & Government
The subject line and body of the question have been combined into a single field and separated by a newline character.
I would caution against using this dataset for any serious application as it contains hilariously out of date information, offensive language, and frequent spelling and grammar errors. It is, however, a charming snapshot of the internet in 2007.
**Roughly 228m llama tokens in 1.4m samples**
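Since each question's subject line and body were merged into a single newline-separated field, they can be pulled apart again with a simple split. The sketch below is illustrative only: the split name and the `question` column are assumptions rather than the documented schema, so check the actual column names on the dataset page before relying on it.

```python
from datasets import load_dataset

# Stream the dataset to avoid downloading all ~1.4m samples up front.
# NOTE: the split name and the "question" column are assumptions for
# illustration; verify the real column names before use.
ds = load_dataset("PocketDoc/Retro-YahooAnswers", split="train", streaming=True)

sample = next(iter(ds))
subject, _, body = sample["question"].partition("\n")
print("Subject:", subject)
print("Body:", body)
```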
### Original README
>Yahoo! Answers Topic Classification Dataset
>
>Version 2, Updated 09/09/2015
>
>
>ORIGIN
>
>The original Yahoo! Answers corpus can be obtained through the Yahoo! Research Alliance Webscope program. The dataset is to be used for approved non-commercial research purposes by recipients who have signed a Data Sharing Agreement with Yahoo!. The dataset is the Yahoo! Answers corpus as of 10/25/2007. It includes all the questions and their corresponding answers. The corpus contains 4483032 questions and their answers.
>
>The Yahoo! Answers topic classification dataset is constructed by Xiang Zhang ([email protected]) from the above dataset. It is used as a text classification benchmark in the following paper: Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).
>
>
>DESCRIPTION
>
>The Yahoo! Answers topic classification dataset is constructed using 10 largest main categories. Each class contains 140,000 training samples and 6,000 testing samples. Therefore, the total number of training samples is 1,400,000 and testing samples 60,000 in this dataset. From all the answers and other meta-information, we only used the best answer content and the main category information.
>
>The file classes.txt contains a list of classes corresponding to each label.
>
>The files train.csv and test.csv contain all the training samples as comma-sparated values. There are 4 columns in them, corresponding to class index (1 to 10), question title, question content and best answer. The text fields are escaped using double quotes ("), and any internal double quote is escaped by 2 double quotes (""). New lines are escaped by a backslash followed with an "n" character, that is "\n". | PocketDoc/Retro-YahooAnswers | [
"task_categories:question-answering",
"size_categories:1M<n<10M",
"language:en",
"not-for-all-audiences",
"alpaca",
"region:us"
]
| 2023-12-07T13:16:47+00:00 | {"language": ["en"], "size_categories": ["1M<n<10M"], "task_categories": ["question-answering"], "pretty_name": "Retro Yahoo! Answers", "tags": ["not-for-all-audiences", "alpaca"]} | 2023-12-07T14:30:02+00:00 | []
| [
"en"
]
| TAGS
#task_categories-question-answering #size_categories-1M<n<10M #language-English #not-for-all-audiences #alpaca #region-us
| ### Description
This dataset is an instruct style dataset comprised of a scrape of the Yahoo! Answers website that was done in 2007. The dataset is comprised of 10 categories labeled 1-10. The categories are as follows:
1. Society & Culture
2. Science & Mathematics
3. Health
4. Education & Reference
5. Computers & Internet
6. Sports
7. Business & Finance
8. Entertainment & Music
9. Family & Relationships
10. Politics & Government
The subject line and body of the question have been combined into a single field and separated by a newline character.
I would caution against using this dataset for any serious application as it contains hilariously out of date information, offensive language, and frequent spelling and grammar errors. It is, however, a charming snapshot of the internet in 2007.
Roughly 228m llama tokens in 1.4m samples
### Original README
>Yahoo! Answers Topic Classification Dataset
>
>Version 2, Updated 09/09/2015
>
>
>ORIGIN
>
>The original Yahoo! Answers corpus can be obtained through the Yahoo! Research Alliance Webscope program. The dataset is to be used for approved non-commercial research purposes by recipients who have signed a Data Sharing Agreement with Yahoo!. The dataset is the Yahoo! Answers corpus as of 10/25/2007. It includes all the questions and their corresponding answers. The corpus contains 4483032 questions and their answers.
>
>The Yahoo! Answers topic classification dataset is constructed by Xiang Zhang (URL@URL) from the above dataset. It is used as a text classification benchmark in the following paper: Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).
>
>
>DESCRIPTION
>
>The Yahoo! Answers topic classification dataset is constructed using 10 largest main categories. Each class contains 140,000 training samples and 6,000 testing samples. Therefore, the total number of training samples is 1,400,000 and testing samples 60,000 in this dataset. From all the answers and other meta-information, we only used the best answer content and the main category information.
>
>The file URL contains a list of classes corresponding to each label.
>
>The files URL and URL contain all the training samples as comma-sparated values. There are 4 columns in them, corresponding to class index (1 to 10), question title, question content and best answer. The text fields are escaped using double quotes ("), and any internal double quote is escaped by 2 double quotes (""). New lines are escaped by a backslash followed with an "n" character, that is "\n". | [
"### Description\nThis dataset is an instruct style dataset comprised of a scrape of the Yahoo! Answers website that was done in 2007. The dataset is comprised of 10 categories labeled 1-10. The categories are as follows:\n\n1. Society & Culture\n2. Science & Mathematics\n3. Health\n4. Education & Reference\n5. Computers & Internet\n6. Sports\n7. Business & Finance\n8. Entertainment & Music\n9. Family & Relationships\n10. Politics & Government\n\nThe subject line and body of the question have been combined into a single field and separated by a newline character.\n\nI would caution against using this dataset for any serious application as it contains hilariously out of date information, offensive language, and frequent spelling and grammar errors. It is, however, a charming snapshot of the internet in 2007.\n\nRoughly 228m llama tokens in 1.4m samples",
"### Original README\n>Yahoo! Answers Topic Classification Dataset\n>\n>Version 2, Updated 09/09/2015\n>\n>\n>ORIGIN\n>\n>The original Yahoo! Answers corpus can be obtained through the Yahoo! Research Alliance Webscope program. The dataset is to be used for approved non-commercial research purposes by recipients who have signed a Data Sharing Agreement with Yahoo!. The dataset is the Yahoo! Answers corpus as of 10/25/2007. It includes all the questions and their corresponding answers. The corpus contains 4483032 questions and their answers. \n>\n>The Yahoo! Answers topic classification dataset is constructed by Xiang Zhang (URL@URL) from the above dataset. It is used as a text classification benchmark in the following paper: Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).\n>\n>\n>DESCRIPTION\n>\n>The Yahoo! Answers topic classification dataset is constructed using 10 largest main categories. Each class contains 140,000 training samples and 6,000 testing samples. Therefore, the total number of training samples is 1,400,000 and testing samples 60,000 in this dataset. From all the answers and other meta-information, we only used the best answer content and the main category information.\n>\n>The file URL contains a list of classes corresponding to each label. \n>\n>The files URL and URL contain all the training samples as comma-sparated values. There are 4 columns in them, corresponding to class index (1 to 10), question title, question content and best answer. The text fields are escaped using double quotes (\"), and any internal double quote is escaped by 2 double quotes (\"\"). New lines are escaped by a backslash followed with an \"n\" character, that is \"\\n\"."
]
| [
"TAGS\n#task_categories-question-answering #size_categories-1M<n<10M #language-English #not-for-all-audiences #alpaca #region-us \n",
"### Description\nThis dataset is an instruct style dataset comprised of a scrape of the Yahoo! Answers website that was done in 2007. The dataset is comprised of 10 categories labeled 1-10. The categories are as follows:\n\n1. Society & Culture\n2. Science & Mathematics\n3. Health\n4. Education & Reference\n5. Computers & Internet\n6. Sports\n7. Business & Finance\n8. Entertainment & Music\n9. Family & Relationships\n10. Politics & Government\n\nThe subject line and body of the question have been combined into a single field and separated by a newline character.\n\nI would caution against using this dataset for any serious application as it contains hilariously out of date information, offensive language, and frequent spelling and grammar errors. It is, however, a charming snapshot of the internet in 2007.\n\nRoughly 228m llama tokens in 1.4m samples",
"### Original README\n>Yahoo! Answers Topic Classification Dataset\n>\n>Version 2, Updated 09/09/2015\n>\n>\n>ORIGIN\n>\n>The original Yahoo! Answers corpus can be obtained through the Yahoo! Research Alliance Webscope program. The dataset is to be used for approved non-commercial research purposes by recipients who have signed a Data Sharing Agreement with Yahoo!. The dataset is the Yahoo! Answers corpus as of 10/25/2007. It includes all the questions and their corresponding answers. The corpus contains 4483032 questions and their answers. \n>\n>The Yahoo! Answers topic classification dataset is constructed by Xiang Zhang (URL@URL) from the above dataset. It is used as a text classification benchmark in the following paper: Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).\n>\n>\n>DESCRIPTION\n>\n>The Yahoo! Answers topic classification dataset is constructed using 10 largest main categories. Each class contains 140,000 training samples and 6,000 testing samples. Therefore, the total number of training samples is 1,400,000 and testing samples 60,000 in this dataset. From all the answers and other meta-information, we only used the best answer content and the main category information.\n>\n>The file URL contains a list of classes corresponding to each label. \n>\n>The files URL and URL contain all the training samples as comma-sparated values. There are 4 columns in them, corresponding to class index (1 to 10), question title, question content and best answer. The text fields are escaped using double quotes (\"), and any internal double quote is escaped by 2 double quotes (\"\"). New lines are escaped by a backslash followed with an \"n\" character, that is \"\\n\"."
]
|
f0bf033553f865677b2cbc167d1e18a4f4b09f71 |
<p align="center">
<img src="https://huggingface.co/datasets/galsenai/french-wolof-translation/resolve/main/distribution.png" />
</p>
| galsenai/french-wolof-translation | [
"region:us"
]
| 2023-12-07T13:28:27+00:00 | {"dataset_info": {"features": [{"name": "translation", "struct": [{"name": "fr", "dtype": "string"}, {"name": "wo", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 2006697, "num_examples": 7263}, {"name": "dev", "num_bytes": 493299, "num_examples": 1816}, {"name": "test", "num_bytes": 632727, "num_examples": 2270}], "download_size": 2224550, "dataset_size": 3132723}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "dev", "path": "data/dev-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-14T12:12:05+00:00 | []
| []
| TAGS
#region-us
|
<p align="center">
<img src="URL />
</p>
| []
| [
"TAGS\n#region-us \n"
]
|
1a9957cbcbe46ee7152cfd940b92cb80c5269104 |
## Dataset Description
Microsoft created the methods2test dataset, consisting of Java JUnit test cases with their corresponding focal methods. 
It contains 780k pairs of JUnit test cases and focal methods which were extracted from a total of 91K
Java open source projects hosted on GitHub.
This is an assembled version of the methods2test dataset. It provides convenient access to the different context levels based on the raw source code (e.g. newlines are preserved). The test cases and associated classes are also made available.
The mapping between test cases and focal methods is based on heuristic rules and Java developers' best practices.
More information could be found here:
- [methods2test Github repo](https://github.com/microsoft/methods2test)
- [Methods2Test: A dataset of focal methods mapped to test cases](https://arxiv.org/pdf/2203.12776.pdf)
## Dataset Schema
```
t: <TEST_CASE>
t_tc: <TEST_CASE> <TEST_CLASS_NAME>
fm: <FOCAL_METHOD>
fm_fc: <FOCAL_CLASS_NAME> <FOCAL_METHOD>
fm_fc_c: <FOCAL_CLASS_NAME> <FOCAL_METHOD> <CONSTRUCTORS>
fm_fc_c_m: <FOCAL_CLASS_NAME> <FOCAL_METHOD> <CONSTRUCTORS> <METHOD_SIGNATURES>
fm_fc_c_m_f: <FOCAL_CLASS_NAME> <FOCAL_METHOD> <CONSTRUCTORS> <METHOD_SIGNATURES> <FIELDS>
```
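As a minimal sketch of how these representations can be accessed, the snippet below loads the `fm` config with the `datasets` library; the other configs listed in the schema above (e.g. `fm+fc`, `fm+fc+c+m+f`) can be loaded the same way.

```python
from datasets import load_dataset

# Load the focal-method-only representation; swap "fm" for e.g. "fm+fc"
# or "fm+fc+c+m+f" to get progressively more focal context.
dataset = load_dataset("andstor/methods2test", "fm", split="train")

print(dataset[0]["text"][:300])  # raw source code of a focal method
```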
## Focal Context
- fm: this representation incorporates exclusively the source
code of the focal method. Intuitively, this contains the most
important information for generating accurate test cases for
the given method.
- fm+fc: this representation adds the focal class name, which
can provide meaningful semantic information to the model.
- fm+fc+c: this representation adds the signatures of the constructor methods of the focal class. The idea behind this
augmentation is that the test case may require instantiating
an object of the focal class in order to properly test the focal
method.
- fm+fc+c+m: this representation adds the signatures of the
other public methods in the focal class. The rationale which
motivated this inclusion is that the test case may need to
invoke other auxiliary methods within the class (e.g., getters,
setters) to set up or tear down the testing environment.
- fm+fc+c+m+f : this representation adds the public fields of
the focal class. The motivation is that test cases may need to
inspect the status of the public fields to properly test a focal
method.

The different levels of focal contexts are the following:
```
T: test case
T_TC: test case + test class name
FM: focal method
FM_FC: focal method + focal class name
FM_FC_C: focal method + focal class name + constructor signatures
FM_FC_C_M: focal method + focal class name + constructor signatures + public method signatures
FM_FC_C_M_F: focal method + focal class name + constructor signatures + public method signatures + public fields
```
## Limitations
The original authors validated the heuristics by inspecting a
statistically significant sample (confidence level of 95% within a 10%
margin of error) of 97 samples from the training set. Two authors
independently evaluated the sample, then met to discuss the disagreements. They found that 90.72% of the samples have a correct
link between the test case and the corresponding focal method.
## Contribution
All thanks to the original authors.
| andstor/methods2test | [
"task_categories:text-generation",
"language:en",
"license:mit",
"unit test",
"java",
"code",
"arxiv:2203.12776",
"region:us"
]
| 2023-12-07T13:37:44+00:00 | {"language": ["en"], "license": "mit", "task_categories": ["text-generation"], "configs": [{"config_name": "fm", "data_files": [{"split": "train", "path": "data/fm/train-*"}, {"split": "test", "path": "data/fm/test-*"}, {"split": "validation", "path": "data/fm/validation-*"}]}, {"config_name": "fm_indented", "data_files": [{"split": "train", "path": "data/fm_indented/train-*"}, {"split": "test", "path": "data/fm_indented/test-*"}, {"split": "validation", "path": "data/fm_indented/validation-*"}]}, {"config_name": "fm+t", "data_files": [{"split": "train", "path": "data/fm+t/train-*"}, {"split": "test", "path": "data/fm+t/test-*"}, {"split": "validation", "path": "data/fm+t/validation-*"}]}, {"config_name": "fm+fc", "data_files": [{"split": "train", "path": "data/fm+fc/train-*"}, {"split": "test", "path": "data/fm+fc/test-*"}, {"split": "validation", "path": "data/fm+fc/validation-*"}]}, {"config_name": "fm+fc+t+tc", "data_files": [{"split": "train", "path": "data/fm+fc+t+tc/train-*"}, {"split": "test", "path": "data/fm+fc+t+tc/test-*"}, {"split": "validation", "path": "data/fm+fc+t+tc/validation-*"}]}, {"config_name": "fm+fc+c", "data_files": [{"split": "train", "path": "data/fm+fc+c/train-*"}, {"split": "test", "path": "data/fm+fc+c/test-*"}, {"split": "validation", "path": "data/fm+fc+c/validation-*"}]}, {"config_name": "fm+fc+c+t+tc", "data_files": [{"split": "train", "path": "data/fm+fc+c+t+tc/train-*"}, {"split": "test", "path": "data/fm+fc+c+t+tc/test-*"}, {"split": "validation", "path": "data/fm+fc+c+t+tc/validation-*"}]}, {"config_name": "fm+fc+c+m", "data_files": [{"split": "train", "path": "data/fm+fc+c+m/train-*"}, {"split": "test", "path": "data/fm+fc+c+m/test-*"}, {"split": "validation", "path": "data/fm+fc+c+m/validation-*"}]}, {"config_name": "fm+fc+c+m+t+tc", "data_files": [{"split": "train", "path": "data/fm+fc+c+m+t+tc/train-*"}, {"split": "test", "path": "data/fm+fc+c+m+t+tc/test-*"}, {"split": "validation", "path": "data/fm+fc+c+m+t+tc/validation-*"}]}, {"config_name": "fm+fc+c+m+f", "data_files": [{"split": "train", "path": "data/fm+fc+c+m+f/train-*"}, {"split": "test", "path": "data/fm+fc+c+m+f/test-*"}, {"split": "validation", "path": "data/fm+fc+c+m+f/validation-*"}]}, {"config_name": "fm+fc+c+m+f+t+tc", "data_files": [{"split": "train", "path": "data/fm+fc+c+m+f+t+tc/train-*"}, {"split": "test", "path": "data/fm+fc+c+m+f+t+tc/test-*"}, {"split": "validation", "path": "data/fm+fc+c+m+f+t+tc/validation-*"}]}, {"config_name": "t", "data_files": [{"split": "train", "path": "data/t/train-*"}, {"split": "test", "path": "data/t/test-*"}, {"split": "validation", "path": "data/t/validation-*"}]}, {"config_name": "t_indented", "data_files": [{"split": "train", "path": "data/t_indented/train-*"}, {"split": "test", "path": "data/t_indented/test-*"}, {"split": "validation", "path": "data/t_indented/validation-*"}]}, {"config_name": "t+tc", "data_files": [{"split": "train", "path": "data/t+tc/train-*"}, {"split": "test", "path": "data/t+tc/test-*"}, {"split": "validation", "path": "data/t+tc/validation-*"}]}], "dataset_info": [{"config_name": "fm", "features": [{"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 440444124, "num_examples": 624022}, {"name": "test", "num_bytes": 59407291, "num_examples": 78388}, {"name": "validation", "num_bytes": 57170315, "num_examples": 78534}], "download_size": 99172217, "dataset_size": 557021730}, {"config_name": "fm+fc", "features": [{"name": "id", 
"dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 506130678, "num_examples": 624022}, {"name": "test", "num_bytes": 68407490, "num_examples": 78388}, {"name": "validation", "num_bytes": 65318956, "num_examples": 78534}], "download_size": 109141139, "dataset_size": 639857124}, {"config_name": "fm+fc+c", "features": [{"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 569209100, "num_examples": 624022}, {"name": "test", "num_bytes": 75552573, "num_examples": 78388}, {"name": "validation", "num_bytes": 73101169, "num_examples": 78534}], "download_size": 117996353, "dataset_size": 717862842}, {"config_name": "fm+fc+c+m", "features": [{"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1354004338, "num_examples": 624022}, {"name": "test", "num_bytes": 187724929, "num_examples": 78388}, {"name": "validation", "num_bytes": 184349299, "num_examples": 78534}], "download_size": 222922572, "dataset_size": 1726078566}, {"config_name": "fm+fc+c+m+f", "features": [{"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1476073209, "num_examples": 624022}, {"name": "test", "num_bytes": 201686811, "num_examples": 78388}, {"name": "validation", "num_bytes": 201259950, "num_examples": 78534}], "download_size": 240405885, "dataset_size": 1879019970}, {"config_name": "fm+fc+c+m+f+t+tc", "features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2019918359, "num_examples": 624022}, {"name": "test", "num_bytes": 269021331, "num_examples": 78388}, {"name": "validation", "num_bytes": 272958781, "num_examples": 78534}], "download_size": 371500476, "dataset_size": 2561898471}, {"config_name": "fm+fc+c+m+t+tc", "features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1897682665, "num_examples": 624022}, {"name": "test", "num_bytes": 255053799, "num_examples": 78388}, {"name": "validation", "num_bytes": 256030595, "num_examples": 78534}], "download_size": 360175965, "dataset_size": 2408767059}, {"config_name": "fm+fc+c+t+tc", "features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1109827485, "num_examples": 624022}, {"name": "test", "num_bytes": 142558255, "num_examples": 78388}, {"name": "validation", "num_bytes": 144523616, "num_examples": 78534}], "download_size": 251861137, "dataset_size": 1396909356}, {"config_name": "fm+fc+t+tc", "features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1046592848, "num_examples": 624022}, {"name": "test", "num_bytes": 135403379, "num_examples": 78388}, {"name": "validation", "num_bytes": 136729952, "num_examples": 78534}], "download_size": 243052074, "dataset_size": 1318726179}, {"config_name": "fm+t", "features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 868034154, "num_examples": 624022}, {"name": "test", "num_bytes": 114371187, "num_examples": 78388}, {"name": "validation", "num_bytes": 
112688219, "num_examples": 78534}], "download_size": 217267853, "dataset_size": 1095093560}, {"config_name": "fm_indented", "features": [{"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 473170158, "num_examples": 624022}, {"name": "test", "num_bytes": 64280367, "num_examples": 78388}, {"name": "validation", "num_bytes": 61093848, "num_examples": 78534}], "download_size": 103174190, "dataset_size": 598544373}, {"config_name": "t", "features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 380207303, "num_examples": 624022}, {"name": "test", "num_bytes": 47993188, "num_examples": 78388}, {"name": "validation", "num_bytes": 49808813, "num_examples": 78534}], "download_size": 113820250, "dataset_size": 478009304}, {"config_name": "t+tc", "features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 550955294, "num_examples": 624022}, {"name": "test", "num_bytes": 68323462, "num_examples": 78388}, {"name": "validation", "num_bytes": 72740770, "num_examples": 78534}], "download_size": 136767271, "dataset_size": 692019526}, {"config_name": "t_indented", "features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 405853738, "num_examples": 624022}, {"name": "test", "num_bytes": 51457514, "num_examples": 78388}, {"name": "validation", "num_bytes": 52970428, "num_examples": 78534}], "download_size": 117732776, "dataset_size": 510281680}], "tags": ["unit test", "java", "code"]} | 2023-12-23T03:01:51+00:00 | [
"2203.12776"
]
| [
"en"
]
| TAGS
#task_categories-text-generation #language-English #license-mit #unit test #java #code #arxiv-2203.12776 #region-us
|
## Dataset Description
Microsoft created the methods2test dataset, consisting of Java Junit test cases with its corresponding focal methods.
It contains 780k pairs of JUnit test cases and focal methods which were extracted from a total of 91K
Java open source project hosted on GitHub.
This is an assembled version of the methods2test dataset. It provides convenient access to the different context levels based on the raw source code (e.g. newlines are preserved). The test cases and associated classes are also made available.
The mapping between test case and focal methods are based heuristics rules and Java developer's best practice.
More information could be found here:
- methods2test Github repo
- Methods2Test: A dataset of focal methods mapped to test cases
## Dataset Schema
## Focal Context
- fm: this representation incorporates exclusively the source
code of the focal method. Intuitively, this contains the most
important information for generating accurate test cases for
the given method.
- fm+fc: this representations adds the focal class name, which
can provide meaningful semantic information to the model.
- fm+fc+c: this representation adds the signatures of the constructor methods of the focal class. The idea behind this
augmentation is that the test case may require instantiating
an object of the focal class in order to properly test the focal
method.
- fm+fc+c+m: this representation adds the signatures of the
other public methods in the focal class. The rationale which
motivated this inclusion is that the test case may need to
invoke other auxiliary methods within the class (e.g., getters,
setters) to set up or tear down the testing environment.
- fm+fc+c+m+f : this representation adds the public fields of
the focal class. The motivation is that test cases may need to
inspect the status of the public fields to properly test a focal
method.
!image/png
The different levels of focal contexts are the following:
## Limitations
The original authors validate the heuristics by inspecting a
statistically significant sample (confidence level of 95% within 10%
margin of error) of 97 samples from the training set. Two authors
independently evaluated the sample, then met to discuss the disagreements. We found that 90.72% of the samples have a correct
link between the test case and the corresponding focal method
## Contribution
All thanks to the original authors.
| [
"## Dataset Description\nMicrosoft created the methods2test dataset, consisting of Java Junit test cases with its corresponding focal methods. \nIt contains 780k pairs of JUnit test cases and focal methods which were extracted from a total of 91K\nJava open source project hosted on GitHub.\n\nThis is an assembled version of the methods2test dataset. It provides convenient access to the different context levels based on the raw source code (e.g. newlines are preserved). The test cases and associated classes are also made available.\n\nThe mapping between test case and focal methods are based heuristics rules and Java developer's best practice.\n\nMore information could be found here:\n- methods2test Github repo\n- Methods2Test: A dataset of focal methods mapped to test cases",
"## Dataset Schema",
"## Focal Context\n- fm: this representation incorporates exclusively the source\ncode of the focal method. Intuitively, this contains the most\nimportant information for generating accurate test cases for\nthe given method.\n- fm+fc: this representations adds the focal class name, which\ncan provide meaningful semantic information to the model.\n- fm+fc+c: this representation adds the signatures of the constructor methods of the focal class. The idea behind this\naugmentation is that the test case may require instantiating\nan object of the focal class in order to properly test the focal\nmethod.\n- fm+fc+c+m: this representation adds the signatures of the\nother public methods in the focal class. The rationale which\nmotivated this inclusion is that the test case may need to\ninvoke other auxiliary methods within the class (e.g., getters,\nsetters) to set up or tear down the testing environment.\n- fm+fc+c+m+f : this representation adds the public fields of\nthe focal class. The motivation is that test cases may need to\ninspect the status of the public fields to properly test a focal\nmethod.\n\n!image/png\n\n The different levels of focal contexts are the following:",
"## Limitations\nThe original authors validate the heuristics by inspecting a\nstatistically significant sample (confidence level of 95% within 10%\nmargin of error) of 97 samples from the training set. Two authors\nindependently evaluated the sample, then met to discuss the disagreements. We found that 90.72% of the samples have a correct\nlink between the test case and the corresponding focal method",
"## Contribution\nAll thanks to the original authors."
]
| [
"TAGS\n#task_categories-text-generation #language-English #license-mit #unit test #java #code #arxiv-2203.12776 #region-us \n",
"## Dataset Description\nMicrosoft created the methods2test dataset, consisting of Java Junit test cases with its corresponding focal methods. \nIt contains 780k pairs of JUnit test cases and focal methods which were extracted from a total of 91K\nJava open source project hosted on GitHub.\n\nThis is an assembled version of the methods2test dataset. It provides convenient access to the different context levels based on the raw source code (e.g. newlines are preserved). The test cases and associated classes are also made available.\n\nThe mapping between test case and focal methods are based heuristics rules and Java developer's best practice.\n\nMore information could be found here:\n- methods2test Github repo\n- Methods2Test: A dataset of focal methods mapped to test cases",
"## Dataset Schema",
"## Focal Context\n- fm: this representation incorporates exclusively the source\ncode of the focal method. Intuitively, this contains the most\nimportant information for generating accurate test cases for\nthe given method.\n- fm+fc: this representations adds the focal class name, which\ncan provide meaningful semantic information to the model.\n- fm+fc+c: this representation adds the signatures of the constructor methods of the focal class. The idea behind this\naugmentation is that the test case may require instantiating\nan object of the focal class in order to properly test the focal\nmethod.\n- fm+fc+c+m: this representation adds the signatures of the\nother public methods in the focal class. The rationale which\nmotivated this inclusion is that the test case may need to\ninvoke other auxiliary methods within the class (e.g., getters,\nsetters) to set up or tear down the testing environment.\n- fm+fc+c+m+f : this representation adds the public fields of\nthe focal class. The motivation is that test cases may need to\ninspect the status of the public fields to properly test a focal\nmethod.\n\n!image/png\n\n The different levels of focal contexts are the following:",
"## Limitations\nThe original authors validate the heuristics by inspecting a\nstatistically significant sample (confidence level of 95% within 10%\nmargin of error) of 97 samples from the training set. Two authors\nindependently evaluated the sample, then met to discuss the disagreements. We found that 90.72% of the samples have a correct\nlink between the test case and the corresponding focal method",
"## Contribution\nAll thanks to the original authors."
]
|
046857f0c5f8d525d12e920e251fcfb55dc85c5e |
# DAC693k
## Description
This dataset, named "DAC693k," is designed for ad targeting in a multi-class classification setting. It consists of two main columns: "domain" and "classes." The "domain" column contains a list of domains, representing various websites or online entities. The "classes" column contains an array representation of ad targeting multi-classes associated with each domain.
## Usage
### Hugging Face Datasets Library
The dataset is formatted to be seamlessly integrated with Hugging Face's datasets library. Users can easily load the dataset using the following code:
```python
from datasets import load_dataset
# Load the dataset
dataset = load_dataset("ansi-code/domain-advertising-classes-693k")
```
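As a small usage sketch following the loading snippet above (the `train` split name is an assumption, since splits are not listed on this card), each record exposes the two columns described in the next section:

```python
from datasets import load_dataset

dataset = load_dataset("ansi-code/domain-advertising-classes-693k")

# Assumes the default config exposes a "train" split with the
# "domain" and "classes" columns described below.
example = dataset["train"][0]
print(example["domain"])   # a single website domain
print(example["classes"])  # list of ad-targeting class labels for that domain
```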
## Columns
- domain: This column contains the domains of websites or online entities.
- classes: The "classes" column represents an array of multi-class labels associated with each domain for ad targeting. (see here mapping https://github.com/patcg-individual-drafts/topics/blob/main/taxonomy_v1.md)
## Data Format
- domain: String
- classes: List of strings representing multi-class labels
## License
This dataset is released under the Apache 2.0 license.
## Citation
If you use this dataset in your work, please cite it using the following BibTeX entry:
```bibtex
@dataset{silvi-2023-dac693k,
title = {domain-advertising-classes-693k},
author = {Andrea Silvi},
year = {2023},
}
```
## Acknowledgements
Additionally, we acknowledge the usage of the ad targeting taxonomy provided in [this GitHub repository](https://github.com/patcg-individual-drafts/topics/). The taxonomy has been instrumental in organizing and labeling the multi-class targets associated with each domain in the dataset.
| ansi-code/domain-advertising-classes-693k | [
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:en",
"license:apache-2.0",
"advertising",
"region:us"
]
| 2023-12-07T13:53:44+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-classification"], "pretty_name": "DAC693k", "tags": ["advertising"]} | 2023-12-07T14:11:04+00:00 | []
| [
"en"
]
| TAGS
#task_categories-text-classification #size_categories-100K<n<1M #language-English #license-apache-2.0 #advertising #region-us
|
# DAC693k
## Description
This dataset, named "DAC693k," is designed for ad targeting in a multi-class classification setting. It consists of two main columns: "domain" and "classes." The "domain" column contains a list of domains, representing various websites or online entities. The "classes" column contains an array representation of ad targeting multi-classes associated with each domain.
## Usage
### Hugging Face Datasets Library
The dataset is formatted to be seamlessly integrated with Hugging Face's datasets library. Users can easily load the dataset using the following code:
## Columns
- domain: This column contains the domains of websites or online entities.
- classes: The "classes" column represents an array of multi-class labels associated with each domain for ad targeting. (see here mapping URL
## Data Format
- domain: String
- classes: List of strings representing multi-class labels
## License
This dataset is released under the Apache 2.0 license.
If you use this dataset in your work, please cite it using the following BibTeX entry:
## Acknowledgements
Additionally, we acknowledge the usage of the ad targeting taxonomy provided in this GitHub repository. The taxonomy has been instrumental in organizing and labeling the multi-class targets associated with each domain in the dataset.
| [
"# DAC693k",
"## Description\n\nThis dataset, named \"DAC693k,\" is designed for ad targeting in a multi-class classification setting. It consists of two main columns: \"domain\" and \"classes.\" The \"domain\" column contains a list of domains, representing various websites or online entities. The \"classes\" column contains an array representation of ad targeting multi-classes associated with each domain.",
"## Usage",
"### Hugging Face Datasets Library\n\nThe dataset is formatted to be seamlessly integrated with Hugging Face's datasets library. Users can easily load the dataset using the following code:",
"## Columns\n- domain: This column contains the domains of websites or online entities.\n- classes: The \"classes\" column represents an array of multi-class labels associated with each domain for ad targeting. (see here mapping URL",
"## Data Format\n- domain: String\n- classes: List of strings representing multi-class labels",
"## License\nThis dataset is released under the Apache 2.0 license.\n\nIf you use this dataset in your work, please cite it using the following BibTeX entry:",
"## Acknowledgements\nAdditionally, we acknowledge the usage of the ad targeting taxonomy provided in this GitHub repository. The taxonomy has been instrumental in organizing and labeling the multi-class targets associated with each domain in the dataset."
]
| [
"TAGS\n#task_categories-text-classification #size_categories-100K<n<1M #language-English #license-apache-2.0 #advertising #region-us \n",
"# DAC693k",
"## Description\n\nThis dataset, named \"DAC693k,\" is designed for ad targeting in a multi-class classification setting. It consists of two main columns: \"domain\" and \"classes.\" The \"domain\" column contains a list of domains, representing various websites or online entities. The \"classes\" column contains an array representation of ad targeting multi-classes associated with each domain.",
"## Usage",
"### Hugging Face Datasets Library\n\nThe dataset is formatted to be seamlessly integrated with Hugging Face's datasets library. Users can easily load the dataset using the following code:",
"## Columns\n- domain: This column contains the domains of websites or online entities.\n- classes: The \"classes\" column represents an array of multi-class labels associated with each domain for ad targeting. (see here mapping URL",
"## Data Format\n- domain: String\n- classes: List of strings representing multi-class labels",
"## License\nThis dataset is released under the Apache 2.0 license.\n\nIf you use this dataset in your work, please cite it using the following BibTeX entry:",
"## Acknowledgements\nAdditionally, we acknowledge the usage of the ad targeting taxonomy provided in this GitHub repository. The taxonomy has been instrumental in organizing and labeling the multi-class targets associated with each domain in the dataset."
]
|
6fc43931d884a108b5882be0443a542a9a2a07a6 | Multilingual instructional dataset for extractive QA fine-tuning for Catalan, Spanish and English, using SQAD, SQAC, CatalanQA and COQCAT datasets.
Contains almost 55K questions, answers and contexts, with development and train splits. | crodri/multilingual_qa | [
"task_categories:question-answering",
"size_categories:10K<n<100K",
"language:ca",
"language:es",
"language:en",
"license:mit",
"region:us"
]
| 2023-12-07T14:00:37+00:00 | {"language": ["ca", "es", "en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"], "pretty_name": "multilingual_qa"} | 2023-12-07T14:05:14+00:00 | []
| [
"ca",
"es",
"en"
]
| TAGS
#task_categories-question-answering #size_categories-10K<n<100K #language-Catalan #language-Spanish #language-English #license-mit #region-us
| Multilingual instructional dataset for extractive QA finetunning for Catalan, Spanish and English, using SQAD, SQAC, CatalanQA and COQCAT datasets.
Contains almost 55K questions, answers and contexts, with development and train splits. | []
| [
"TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-Catalan #language-Spanish #language-English #license-mit #region-us \n"
]
|
d5906d214a534e2d157f650c9acabda66a3fe774 | dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 19558.4
num_examples: 55
download_size: 19558.4
dataset_size: 19558.4
configs:
- config_name: default
data_files:
- split: train
path: Article1.jsonl | andrewatef/PRO2 | [
"region:us"
]
| 2023-12-07T14:08:38+00:00 | {} | 2023-12-13T20:20:27+00:00 | []
| []
| TAGS
#region-us
| dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 19558.4
num_examples: 55
download_size: 19558.4
dataset_size: 19558.4
configs:
- config_name: default
data_files:
- split: train
path: URL | []
| [
"TAGS\n#region-us \n"
]
|
7e53c8b60e86f6d48d6c4e8009d1413b6a29f3bb |
## Description
🍙 Onigiri Network is the first AI-generated anime channel.
## Model
HotshotXL
## LoRA
KappaNeuro/studio-ghibli-style
## Style
Studio Ghibli Style
## Voice
Cloée
## Prompt
A video channel which produces anime episodes of fictional franchises.
Stories should be short, about 1 minute or 2, but full of action and fun.
It will NEVER create content from existing artists or studio.
Instead it will create its own artistic stories, content and characters.
Characters should not be biased towards any specific gender, work, country, religion or culture.
Some themes should be avoided in the stories, such as violence, sex, war, crimes etc.
No nudity is tolerated.
However adventure, mystery, sci-fi, fantasy, soft horror (witches etc) etc are allowed, as well as very mild violence, such as martial arts, stories about samurais, ninjas etc.
| jbilcke-hf/ai-tube-onigiri-network | [
"license:cc-by-nc-sa-4.0",
"region:us"
]
| 2023-12-07T14:20:40+00:00 | {"license": "cc-by-nc-sa-4.0", "pretty_name": "Onigiri \ud83c\udf59"} | 2023-12-08T11:49:51+00:00 | []
| []
| TAGS
#license-cc-by-nc-sa-4.0 #region-us
|
## Description
Onigiri Network is the first AI-generated anime channel.
## Model
HotshotXL
## LoRA
KappaNeuro/studio-ghibli-style
## Style
Studio Ghibli Style
## Voice
Cloée
## Prompt
A video channel which produces anime episodes of fictional franchises.
Stories should be short, about 1 minute or 2, but full of action and fun.
It will NEVER create content from existing artists or studio.
Instead it will create its own artistic stories, content, stories and characters.
Characters should not be biased towards any specific gender, work, country, religion or culture.
Some themes should be avoided in the stories, such as violence, sex, war, crimes etc.
No nudity is tolerated.
However adventure, mystery, sci-fi, fantasy, soft horror (witches etc) etc are allowed, as well as very mild violence, such as martial arts, stories about samurais, ninjas etc.
| [
"## Description\n\n Onigiri Network is the first AI-generated anime channel.",
"## Model\n\nHotshotXL",
"## LoRA\n\nKappaNeuro/studio-ghibli-style",
"## Style\n\nStudio Ghibli Style",
"## Voice\n\nCloée",
"## Prompt\n\nA video channel which produces anime episodes of fictional franchises.\nStories should be short, about 1 minute or 2, but full of action and fun.\nIt will NEVER create content from existing artists or studio.\nInstead it will create its own artistic stories, content, stories and characters.\nCharacters should not be biased towards any specific gender, work, country, religion or culture.\nSome themes should be avoided in the stories, such as violence, sex, war, crimes etc.\nNo nudity is tolerated.\nHowever adventure, mystery, sci-fi, fantasy, soft horror (witches etc) etc are allowed, as well as very mild violence, such as martial arts, stories about samurais, ninjas etc."
]
| [
"TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n",
"## Description\n\n Onigiri Network is the first AI-generated anime channel.",
"## Model\n\nHotshotXL",
"## LoRA\n\nKappaNeuro/studio-ghibli-style",
"## Style\n\nStudio Ghibli Style",
"## Voice\n\nCloée",
"## Prompt\n\nA video channel which produces anime episodes of fictional franchises.\nStories should be short, about 1 minute or 2, but full of action and fun.\nIt will NEVER create content from existing artists or studio.\nInstead it will create its own artistic stories, content, stories and characters.\nCharacters should not be biased towards any specific gender, work, country, religion or culture.\nSome themes should be avoided in the stories, such as violence, sex, war, crimes etc.\nNo nudity is tolerated.\nHowever adventure, mystery, sci-fi, fantasy, soft horror (witches etc) etc are allowed, as well as very mild violence, such as martial arts, stories about samurais, ninjas etc."
]
|
ab60f34c9ada0ec54d7afa21737730b0bf529f0c |
# ShareGPT4V-PT-ru
## Dataset Description
- **Paper:** https://huggingface.co/papers/2311.12793
- **Repository** https://github.com/InternLM/InternLM-XComposer/tree/main/projects/ShareGPT4V
- **Homepage** https://ShareGPT4V.github.io/
This is the **ShareGPT4V-PT(1.25M)** subset of the [Lin-Chen/ShareGPT4V](https://huggingface.co/datasets/Lin-Chen/ShareGPT4V) dataset translated into Russian. | d0rj/ShareGPT4V-PT-ru | [
"task_categories:visual-question-answering",
"task_categories:question-answering",
"task_categories:conversational",
"language_creators:translated",
"multilinguality:monolingual",
"size_categories:1M<n<10M",
"source_datasets:Lin-Chen/ShareGPT4V",
"language:ru",
"license:cc-by-nc-4.0",
"chat",
"visual-chat",
"multimodal-chat",
"arxiv:2311.12793",
"region:us"
]
| 2023-12-07T14:23:58+00:00 | {"language_creators": ["translated"], "language": ["ru"], "license": "cc-by-nc-4.0", "multilinguality": ["monolingual"], "size_categories": ["1M<n<10M"], "source_datasets": ["Lin-Chen/ShareGPT4V"], "task_categories": ["visual-question-answering", "question-answering", "conversational"], "paperswithcode_id": "sharegpt4v", "pretty_name": "ShareGPT4V (ru)", "dataset_info": {"features": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "image", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1315012350, "num_examples": 1246901}], "download_size": 554413159, "dataset_size": 1315012350}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["chat", "visual-chat", "multimodal-chat"]} | 2023-12-08T07:50:21+00:00 | [
"2311.12793"
]
| [
"ru"
]
| TAGS
#task_categories-visual-question-answering #task_categories-question-answering #task_categories-conversational #language_creators-translated #multilinguality-monolingual #size_categories-1M<n<10M #source_datasets-Lin-Chen/ShareGPT4V #language-Russian #license-cc-by-nc-4.0 #chat #visual-chat #multimodal-chat #arxiv-2311.12793 #region-us
|
# ShareGPT4V-PT-ru
## Dataset Description
- Paper: URL
- Repository URL
- Homepage URL
This is the ShareGPT4V-PT(1.25M) subset of the Lin-Chen/ShareGPT4V dataset translated into Russian. | [
"# ShareGPT4V-PT-ru",
"## Dataset Description\n\n- Paper: URL\n- Repository URL\n- Homepage URL\n\nThis is translated version of ShareGPT4V-PT(1.25M) subset of Lin-Chen/ShareGPT4V dataset into Russian."
]
| [
"TAGS\n#task_categories-visual-question-answering #task_categories-question-answering #task_categories-conversational #language_creators-translated #multilinguality-monolingual #size_categories-1M<n<10M #source_datasets-Lin-Chen/ShareGPT4V #language-Russian #license-cc-by-nc-4.0 #chat #visual-chat #multimodal-chat #arxiv-2311.12793 #region-us \n",
"# ShareGPT4V-PT-ru",
"## Dataset Description\n\n- Paper: URL\n- Repository URL\n- Homepage URL\n\nThis is translated version of ShareGPT4V-PT(1.25M) subset of Lin-Chen/ShareGPT4V dataset into Russian."
]
|
7a61a3e7612fa5ce561622316845df058b5a7a6c |
## Description
Follow us to experience a new era of 3D AI-animated short films.
## Model
SVD
## LoRA
artificialguybr/3DRedmond-V1
## Style
- 3D Render Style
- 3DRenderAF
## Voice
Cloée
## Music
Background cinematic music
## Prompt
A video channel which produces short 3D films of fictional franchises.
Stories should be short, about 1 minute or 2, but full of action and fun.
It will NEVER create content from existing artists or studio.
Instead it will create its own artistic stories, content, stories and characters.
Characters should not be biased towards any specific gender, work, country, religion or culture.
Some themes should be avoided in the stories, such as violence, sex, war, crimes etc.
No nudity is tolerated.
It should use animal characters whenever possible, but also human characters.
Characters should be very cartoony / cute.
| jbilcke-hf/ai-tube-tensar-studios | [
"license:cc-by-nc-sa-4.0",
"region:us"
]
| 2023-12-07T14:35:32+00:00 | {"license": "cc-by-nc-sa-4.0", "pretty_name": "Tensar Studios"} | 2023-12-12T22:49:08+00:00 | []
| []
| TAGS
#license-cc-by-nc-sa-4.0 #region-us
|
## Description
Follow us to experience a new era of 3D AI-animated short films.
## Model
SVD
## LoRA
artificialguybr/3DRedmond-V1
## Style
- 3D Render Style
- 3DRenderAF
## Voice
Cloée
## Music
Background cinematic music
## Prompt
A video channel which produces short 3D films of fictional franchises.
Stories should be short, about 1 minute or 2, but full of action and fun.
It will NEVER create content from existing artists or studio.
Instead it will create its own artistic stories, content, stories and characters.
Characters should not be biased towards any specific gender, work, country, religion or culture.
Some themes should be avoided in the stories, such as violence, sex, war, crimes etc.
No nudity is tolerated.
It should use animal characters whenever possible, but also human characters.
Characters should be very cartoony / cute.
| [
"## Description\n\nFollow us to experience a new era of 3D AI-animated short films.",
"## Model\n\nSVD",
"## LoRA\n\nartificialguybr/3DRedmond-V1",
"## Style\n\n- 3D Render Style\n- 3DRenderAF",
"## Voice\n\nCloée",
"## Music\n\nBackground cinematic music",
"## Prompt\n\nA video channel which produces short 3D films of fictional franchises.\nStories should be short, about 1 minute or 2, but full of action and fun.\nIt will NEVER create content from existing artists or studio.\nInstead it will create its own artistic stories, content, stories and characters.\nCharacters should not be biased towards any specific gender, work, country, religion or culture.\nSome themes should be avoided in the stories, such as violence, sex, war, crimes etc.\nNo nudity is tolerated.\nIt should use animal characters whenever possible, but also human characters.\nCharacters should be very cartoony / cute."
]
| [
"TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n",
"## Description\n\nFollow us to experience a new era of 3D AI-animated short films.",
"## Model\n\nSVD",
"## LoRA\n\nartificialguybr/3DRedmond-V1",
"## Style\n\n- 3D Render Style\n- 3DRenderAF",
"## Voice\n\nCloée",
"## Music\n\nBackground cinematic music",
"## Prompt\n\nA video channel which produces short 3D films of fictional franchises.\nStories should be short, about 1 minute or 2, but full of action and fun.\nIt will NEVER create content from existing artists or studio.\nInstead it will create its own artistic stories, content, stories and characters.\nCharacters should not be biased towards any specific gender, work, country, religion or culture.\nSome themes should be avoided in the stories, such as violence, sex, war, crimes etc.\nNo nudity is tolerated.\nIt should use animal characters whenever possible, but also human characters.\nCharacters should be very cartoony / cute."
]
|
59624f59760ac9bb48c90265990ba3b3d6b618ee | # Dataset Card for "logo-dataset-40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | kopyl/logo-dataset-40 | [
"region:us"
]
| 2023-12-07T14:54:45+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 369868.0, "num_examples": 40}], "download_size": 380020, "dataset_size": 369868.0}} | 2023-12-07T14:54:48+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "logo-dataset-40"
More Information needed | [
"# Dataset Card for \"logo-dataset-40\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"logo-dataset-40\"\n\nMore Information needed"
]
|
d3e08a5496c180c58afa765e4faf26e6b80e870f | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | PriscilaRubim/Framingham | [
"task_categories:text-generation",
"medical",
"region:us"
]
| 2023-12-07T15:05:37+00:00 | {"task_categories": ["text-generation"], "pretty_name": "Framingham", "tags": ["medical"]} | 2023-12-07T18:36:47+00:00 | []
| []
| TAGS
#task_categories-text-generation #medical #region-us
| # Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
| [
"TAGS\n#task_categories-text-generation #medical #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
|
09203f6b0191b67157e2cec3a24d15777d5650f2 |
# Financial Tweets
This dataset is a comprehensive collection of all the tweets from my [Discord bot](https://github.com/StephanAkkerman/fintwit-bot) that keeps track of financial influencers on Twitter.
The data includes a variety of information, such as the tweet and the price of the tickers in that tweet at the time of posting.
This dataset can be used for a variety of tasks, such as sentiment analysis and masked language modelling (MLM).
We used this dataset for training our [FinTwitBERT model](https://huggingface.co/StephanAkkerman/FinTwitBERT).
## Overview
This dataset includes the following three datasets:
- Crypto: https://huggingface.co/datasets/StephanAkkerman/financial-tweets-crypto
- Stocks (and forex): https://huggingface.co/datasets/StephanAkkerman/financial-tweets-stocks
- Other (tweets without cash tags): https://huggingface.co/datasets/StephanAkkerman/financial-tweets-other
## Data Description
The dataset comprises tweets related to financial markets, stocks, and economic discussions.
### Dataset Fields
The data fields are as follows:
* `timestap`: The time the tweet was sent.
* `tweet_text`: All of the text of the tweet, including quoted tweets (prefixed with `>`).
* `tweet_url`: The URL of the tweet.
* `tweet_type`: The type of tweet, this can be tweet, retweet, or quote tweet.
* `price_of_ticker`: The price of the tickers mentioned in USD ($).
* `change_of_ticker`: The 24h price change of the tickers in USD ($).
* `tickers_mentioned`: All the tickers that are mentioned in the tweet.
* `category`: What type of category the tweet has, the suffix `_images` means that the tweet included an image.
## Usage
This dataset is ideal for training and evaluating machine learning models for sentiment analysis, especially those focused on understanding market trends and investor sentiment. It can be used for academic research, financial market analysis, and developing AI tools for financial institutions.
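As a quick illustration (not part of the original card), the dataset can be loaded with the `datasets` library. The split name is an assumption here, and the column names follow the field list above:

```python
from datasets import load_dataset

# Load the combined financial-tweets dataset (the "train" split name is an assumption).
tweets = load_dataset("StephanAkkerman/financial-tweets", split="train")

# Inspect one record: the raw tweet text and the tickers it mentions.
example = tweets[0]
print(example["tweet_text"])
print(example["tickers_mentioned"])
```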
## Acknowledgments
We extend our heartfelt gratitude to all the authors of the original tweets.
## License
This dataset is made available under the MIT license, adhering to the licensing terms of the original datasets. | StephanAkkerman/financial-tweets | [
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"finance",
"twitter",
"news",
"crypto",
"stocks",
"tweet",
"region:us"
]
| 2023-12-07T15:34:00+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["text-classification"], "pretty_name": "Financial Tweets", "tags": ["finance", "twitter", "news", "crypto", "stocks", "tweet"]} | 2023-12-11T11:03:58+00:00 | []
| [
"en"
]
| TAGS
#task_categories-text-classification #size_categories-100K<n<1M #language-English #license-mit #finance #twitter #news #crypto #stocks #tweet #region-us
|
# Financial Tweets
This dataset is a comprehensive collection of all the tweets from my Discord bot that keeps track of financial influencers on Twitter.
The data includes a variety of information, such as the tweet and the price of the tickers in that tweet at the time of posting.
This dataset can be used for a variety of tasks, such as sentiment analysis and masked language modelling (MLM).
We used this dataset for training our FinTwitBERT model.
## Overview
This dataset includes the following three datasets:
- Crypto: URL
- Stocks (and forex): URL
- Other (tweets without cash tags): URL
## Data Description
The dataset comprises tweets related to financial markets, stocks, and economic discussions.
### Dataset Fields
The data fields are as follows:
* 'timestap': The time the tweet was sent.
* 'tweet_text': All of the text of the tweet, including quoted tweets (prefixed with '>').
* 'tweet_url': The URL of the tweet.
* 'tweet_type': The type of tweet, this can be tweet, retweet, or quote tweet.
* 'price_of_ticker': The price of the tickers mentioned in USD ($).
* 'change_of_ticker': The 24h price change of the tickers in USD ($).
* 'tickers_mentioned': All the tickers that are mentioned in the tweet.
* 'category': What type of category the tweet has, the suffix '_images' means that the tweet included an image.
## Usage
This dataset is ideal for training and evaluating machine learning models for sentiment analysis, especially those focused on understanding market trends and investor sentiment. It can be used for academic research, financial market analysis, and developing AI tools for financial institutions.
## Acknowledgments
We extend our heartfelt gratitude to all the authors of the original tweets.
## License
This dataset is made available under the MIT license, adhering to the licensing terms of the original datasets. | [
"# Financial Tweets\nThis dataset is a comprehensive collection of all the tweets from my Discord bot that keeps track of financial influencers on Twitter.\nThe data includes a variety of information, such as the tweet and the price of the tickers in that tweet at the time of posting.\nThis dataset can be used for a variety of tasks, such as sentiment analysis and masked language modelling (MLM).\n\nWe used this dataset for training our FinTwitBERT model.",
"## Overview\nThis datasets includes all the following three datasets:\n- Crypto: URL\n- Stocks (and forex): URL\n- Other (tweets without cash tags): URL",
"## Data Description\nThe dataset comprises tweets related to financial markets, stocks, and economic discussions.",
"### Dataset Fields\nThe data fields are as follows:\n\n* 'timestap': The time the tweet was sent.\n* 'tweet_text': All of the text of the tweet, including quoted tweets (prefixed with '>').\n* 'tweet_url': The URL of the tweet.\n* 'tweet_type': The type of tweet, this can be tweet, retweet, or quote tweet.\n* 'price_of_ticker': The price of the tickers mentioned in USD ($).\n* 'change_of_ticker': The 24h price change of the tickers in USD ($).\n* 'tickers_mentioned': All the tickers that are mentioned in the tweet.\n* 'category': What type of category the tweet has, the suffix '_images' means that the tweet included an image.",
"## Usage\nThis dataset is ideal for training and evaluating machine learning models for sentiment analysis, especially those focused on understanding market trends and investor sentiment. It can be used for academic research, financial market analysis, and developing AI tools for financial institutions.",
"## Acknowledgments\nWe extend our heartfelt gratitude to all the authors of the original tweets.",
"## License\nThis dataset is made available under the MIT license, adhering to the licensing terms of the original datasets."
]
| [
"TAGS\n#task_categories-text-classification #size_categories-100K<n<1M #language-English #license-mit #finance #twitter #news #crypto #stocks #tweet #region-us \n",
"# Financial Tweets\nThis dataset is a comprehensive collection of all the tweets from my Discord bot that keeps track of financial influencers on Twitter.\nThe data includes a variety of information, such as the tweet and the price of the tickers in that tweet at the time of posting.\nThis dataset can be used for a variety of tasks, such as sentiment analysis and masked language modelling (MLM).\n\nWe used this dataset for training our FinTwitBERT model.",
"## Overview\nThis datasets includes all the following three datasets:\n- Crypto: URL\n- Stocks (and forex): URL\n- Other (tweets without cash tags): URL",
"## Data Description\nThe dataset comprises tweets related to financial markets, stocks, and economic discussions.",
"### Dataset Fields\nThe data fields are as follows:\n\n* 'timestap': The time the tweet was sent.\n* 'tweet_text': All of the text of the tweet, including quoted tweets (prefixed with '>').\n* 'tweet_url': The URL of the tweet.\n* 'tweet_type': The type of tweet, this can be tweet, retweet, or quote tweet.\n* 'price_of_ticker': The price of the tickers mentioned in USD ($).\n* 'change_of_ticker': The 24h price change of the tickers in USD ($).\n* 'tickers_mentioned': All the tickers that are mentioned in the tweet.\n* 'category': What type of category the tweet has, the suffix '_images' means that the tweet included an image.",
"## Usage\nThis dataset is ideal for training and evaluating machine learning models for sentiment analysis, especially those focused on understanding market trends and investor sentiment. It can be used for academic research, financial market analysis, and developing AI tools for financial institutions.",
"## Acknowledgments\nWe extend our heartfelt gratitude to all the authors of the original tweets.",
"## License\nThis dataset is made available under the MIT license, adhering to the licensing terms of the original datasets."
]
|
081b6a1174196c16abf30927ac712594340637a0 | # Dataset Card for "vi-ovm-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | longhoang06/vi-ovm-dataset | [
"region:us"
]
| 2023-12-07T15:43:26+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "candidate", "dtype": "string"}, {"name": "label", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 341311465, "num_examples": 375000}], "download_size": 32191994, "dataset_size": 341311465}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-07T15:43:33+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "vi-ovm-dataset"
More Information needed | [
"# Dataset Card for \"vi-ovm-dataset\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"vi-ovm-dataset\"\n\nMore Information needed"
]
|
4081687b468eb11f5daeeeb0547f9f49d3ee1613 |
Hugging Face Blog Content.. | chenglu/hf-blogs | [
"task_categories:text-classification",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"nlp",
"region:us"
]
| 2023-12-07T16:26:02+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["text-classification"], "pretty_name": "Hugging Face Blog Content", "tags": ["nlp"]} | 2023-12-08T01:40:15+00:00 | []
| [
"en"
]
| TAGS
#task_categories-text-classification #size_categories-n<1K #language-English #license-apache-2.0 #nlp #region-us
|
Hugging Face Blog Content.. | []
| [
"TAGS\n#task_categories-text-classification #size_categories-n<1K #language-English #license-apache-2.0 #nlp #region-us \n"
]
|
78151846b7a7efa5c83fdab6da736376024e1042 | # Dataset Card for "SpeakerEmbedding0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | KyS/SpeakerEmbedding0 | [
"region:us"
]
| 2023-12-07T16:29:52+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "Speakers", "dtype": "string"}, {"name": "Audio", "struct": [{"name": "array", "sequence": "float64"}, {"name": "path", "dtype": "string"}, {"name": "sampling_rate", "dtype": "int64"}]}], "splits": [{"name": "train", "num_bytes": 29234994, "num_examples": 46}], "download_size": 7249854, "dataset_size": 29234994}} | 2023-12-07T16:29:58+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "SpeakerEmbedding0"
More Information needed | [
"# Dataset Card for \"SpeakerEmbedding0\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"SpeakerEmbedding0\"\n\nMore Information needed"
]
|
343364d7f890ecbc083f7e93e7d677698c0df38a | # Dataset Card for "multitiny_rename"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | CJWeiss/multitiny_rename | [
"region:us"
]
| 2023-12-07T16:56:04+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "input", "sequence": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 784230112, "num_examples": 1202}, {"name": "test", "num_bytes": 118646528, "num_examples": 240}, {"name": "valid", "num_bytes": 116992208, "num_examples": 161}], "download_size": 460324193, "dataset_size": 1019868848}} | 2023-12-07T16:56:44+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "multitiny_rename"
More Information needed | [
"# Dataset Card for \"multitiny_rename\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"multitiny_rename\"\n\nMore Information needed"
]
|
8a7c892d3f2fa87d33f0cf4351d2c881a38f26a4 | This dataset comprises nine chunks (out of ten) from the ```Cerebras/SlimPajama-627B``` dataset, processed into a binary index (bin idx) format.
The first chunk is located at : ```rwkv-x-dev/slimpajama-binidx```.
Due to their large size, each chunk is split into multiple parts for easier handling. To reassemble and decompress these parts, follow these steps:
1. Combine all parts of the desired chunk into a single file:
```cat chunk2_text_document_part_* > chunk2_text_document.tar.xz```
2. Decompress the combined file:
```xz -d chunk2_text_document.tar.xz```
3. Extract the decompressed file:
```tar -xvf chunk2_text_document.tar```
This process successfully reassembles and decompresses the chosen chunk.
```
cat chunk2_text_document_part_* > chunk2_text_document.tar.xz
xz -d chunk2_text_document.tar.xz
tar -xvf chunk2_text_document.tar
``` | something-else/Slim-binidx | [
"region:us"
]
| 2023-12-07T17:00:54+00:00 | {"pretty_name": "Slim pajama 627B binidx"} | 2023-12-08T13:52:13+00:00 | []
| []
| TAGS
#region-us
| This dataset comprises nine chunks (out of ten) from the dataset, processed into a binary index (bin idx) format.
The first chunk is located at : .
Due to their large size, each chunk is split into multiple parts for easier handling. To reassemble and decompress these parts, follow these steps:
1. Combine all parts of the desired chunk into a single file:
2. Decompress the combined file:
3. Extract the decompressed file:
This process successfully reassembles and decompresses the chosen chunk.
| []
| [
"TAGS\n#region-us \n"
]
|
32debd0ced3432d8e1135135549b9bfcca858e80 | This is information about the dataset | Schandkroete/SLC_Sentiment_Analysis | [
"task_categories:text-classification",
"region:us"
]
| 2023-12-07T17:05:58+00:00 | {"task_categories": ["text-classification"]} | 2023-12-07T17:09:11+00:00 | []
| []
| TAGS
#task_categories-text-classification #region-us
| This is information about the dataset | []
| [
"TAGS\n#task_categories-text-classification #region-us \n"
]
|
399ed90a57485c8fbedca94e9b2f7df33661173d | # Sampled Big Patent Dataset
This is a sampled Trelis/big_patent_sample dataset containing rows of data with descriptions shorter than or equal to 100,000 characters in length.
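A minimal usage sketch (not part of the original card; the `train` split name is an assumption, and `description` is the field documented in the Big Patent card below):

```python
from datasets import load_dataset

# Split name is an assumption; adjust if the repository exposes a different split.
ds = load_dataset("Trelis/big_patent_100k_characters", split="train")

# Every description in this sampled set should be at most 100,000 characters long.
max_len = max(len(row["description"]) for row in ds)
print(max_len, "characters in the longest description")
```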
--- Sampled from Trelis/big_patent_sampled ---
# Sampled big_patent Dataset
This is a sampled big_patent dataset - sampled down for shorter fine-tunings.
The data is sampled with the aim of providing an even distribution across data lengths. The distribution is quite flat up until 1 million characters in length, making the dataset good for training on lengths up to 250,000 tokens.
# Dataset Card for Big Patent
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Big Patent](https://evasharma.github.io/bigpatent/)
- **Repository:**
- **Paper:** [BIGPATENT: A Large-Scale Dataset for Abstractive and Coherent Summarization](https://arxiv.org/abs/1906.03741)
- **Leaderboard:**
- **Point of Contact:** [Lu Wang](mailto:[email protected])
### Dataset Summary
BIGPATENT, consisting of 1.3 million records of U.S. patent documents along with human written abstractive summaries.
Each US patent application is filed under a Cooperative Patent Classification (CPC) code.
There are nine such classification categories:
- a: Human Necessities
- b: Performing Operations; Transporting
- c: Chemistry; Metallurgy
- d: Textiles; Paper
- e: Fixed Constructions
- f: Mechanical Engineering; Lightning; Heating; Weapons; Blasting
- g: Physics
- h: Electricity
- y: General tagging of new or cross-sectional technology
Current defaults are 2.1.2 version (fix update to cased raw strings) and 'all' CPC codes:
```python
from datasets import load_dataset
ds = load_dataset("big_patent") # default is 'all' CPC codes
ds = load_dataset("big_patent", "all") # the same as above
ds = load_dataset("big_patent", "a") # only 'a' CPC codes
ds = load_dataset("big_patent", codes=["a", "b"])
```
To use 1.0.0 version (lower cased tokenized words), pass both parameters `codes` and `version`:
```python
ds = load_dataset("big_patent", codes="all", version="1.0.0")
ds = load_dataset("big_patent", codes="a", version="1.0.0")
ds = load_dataset("big_patent", codes=["a", "b"], version="1.0.0")
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
### Data Instances
Each instance contains a pair of `description` and `abstract`. `description` is extracted from the Description section of the Patent while `abstract` is extracted from the Abstract section.
```
{
'description': 'FIELD OF THE INVENTION \n [0001] This invention relates to novel calcium phosphate-coated implantable medical devices and processes of making same. The unique calcium-phosphate coated implantable medical devices minimize...',
'abstract': 'This invention relates to novel calcium phosphate-coated implantable medical devices...'
}
```
### Data Fields
- `description`: detailed description of patent.
- `abstract`: Patent abstract.
### Data Splits
| | train | validation | test |
|:----|------------------:|-------------:|-------:|
| all | 1207222 | 67068 | 67072 |
| a | 174134 | 9674 | 9675 |
| b | 161520 | 8973 | 8974 |
| c | 101042 | 5613 | 5614 |
| d | 10164 | 565 | 565 |
| e | 34443 | 1914 | 1914 |
| f | 85568 | 4754 | 4754 |
| g | 258935 | 14385 | 14386 |
| h | 257019 | 14279 | 14279 |
| y | 124397 | 6911 | 6911 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@article{DBLP:journals/corr/abs-1906-03741,
author = {Eva Sharma and
Chen Li and
Lu Wang},
title = {{BIGPATENT:} {A} Large-Scale Dataset for Abstractive and Coherent
Summarization},
journal = {CoRR},
volume = {abs/1906.03741},
year = {2019},
url = {http://arxiv.org/abs/1906.03741},
eprinttype = {arXiv},
eprint = {1906.03741},
timestamp = {Wed, 26 Jun 2019 07:14:58 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1906-03741.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@mattbui](https://github.com/mattbui) for adding this dataset. | Trelis/big_patent_100k_characters | [
"task_categories:summarization",
"annotations_creators:no-annotation",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:n<1k",
"source_datasets:big_patent",
"language:en",
"license:cc-by-4.0",
"patent-summarization",
"arxiv:1906.03741",
"region:us"
]
| 2023-12-07T17:45:15+00:00 | {"annotations_creators": ["no-annotation"], "language_creators": ["found"], "language": ["en"], "license": ["cc-by-4.0"], "multilinguality": ["monolingual"], "size_categories": ["n<1k"], "source_datasets": ["big_patent"], "task_categories": ["summarization"], "task_ids": [], "paperswithcode_id": "bigpatent", "pretty_name": "Big Patent <100k characters", "tags": ["patent-summarization"]} | 2023-12-08T13:42:34+00:00 | [
"1906.03741"
]
| [
"en"
]
| TAGS
#task_categories-summarization #annotations_creators-no-annotation #language_creators-found #multilinguality-monolingual #size_categories-n<1k #source_datasets-big_patent #language-English #license-cc-by-4.0 #patent-summarization #arxiv-1906.03741 #region-us
| Sampled Big Patent Dataset
==========================
This is a sampled Trelis/big\_patent\_sample dataset containing rows of data with descriptions shorter than or equal to 100,000 characters in length.
--- Sampled from Trelis/big\_patent\_sampled ---
Sampled big\_patent Dataset
===========================
This is a sampled big\_patent dataset - sampled down for shorter fine-tunings.
The data is sampled with the aim of providing an even distribution across data lengths. The distribution is quite flat up until 1 million characters in length, making the dataset good for training on lengths up to 250,000 tokens.
Dataset Card for Big Patent
===========================
Table of Contents
-----------------
* Dataset Description
+ Dataset Summary
+ Supported Tasks and Leaderboards
+ Languages
* Dataset Structure
+ Data Instances
+ Data Fields
+ Data Splits
* Dataset Creation
+ Curation Rationale
+ Source Data
+ Annotations
+ Personal and Sensitive Information
* Considerations for Using the Data
+ Social Impact of Dataset
+ Discussion of Biases
+ Other Known Limitations
* Additional Information
+ Dataset Curators
+ Licensing Information
+ Citation Information
+ Contributions
Dataset Description
-------------------
* Homepage: Big Patent
* Repository:
* Paper: BIGPATENT: A Large-Scale Dataset for Abstractive and Coherent Summarization
* Leaderboard:
* Point of Contact: Lu Wang
### Dataset Summary
BIGPATENT, consisting of 1.3 million records of U.S. patent documents along with human written abstractive summaries.
Each US patent application is filed under a Cooperative Patent Classification (CPC) code.
There are nine such classification categories:
* a: Human Necessities
* b: Performing Operations; Transporting
* c: Chemistry; Metallurgy
* d: Textiles; Paper
* e: Fixed Constructions
* f: Mechanical Engineering; Lightning; Heating; Weapons; Blasting
* g: Physics
* h: Electricity
* y: General tagging of new or cross-sectional technology
Current defaults are 2.1.2 version (fix update to cased raw strings) and 'all' CPC codes:
To use 1.0.0 version (lower cased tokenized words), pass both parameters 'codes' and 'version':
### Supported Tasks and Leaderboards
### Languages
English
Dataset Structure
-----------------
### Data Instances
Each instance contains a pair of 'description' and 'abstract'. 'description' is extracted from the Description section of the Patent while 'abstract' is extracted from the Abstract section.
### Data Fields
* 'description': detailed description of patent.
* 'abstract': Patent abstract.
### Data Splits
Dataset Creation
----------------
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
Considerations for Using the Data
---------------------------------
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
Additional Information
----------------------
### Dataset Curators
### Licensing Information
### Contributions
Thanks to @mattbui for adding this dataset.
| [
"### Dataset Summary\n\n\nBIGPATENT, consisting of 1.3 million records of U.S. patent documents along with human written abstractive summaries.\nEach US patent application is filed under a Cooperative Patent Classification (CPC) code.\nThere are nine such classification categories:\n\n\n* a: Human Necessities\n* b: Performing Operations; Transporting\n* c: Chemistry; Metallurgy\n* d: Textiles; Paper\n* e: Fixed Constructions\n* f: Mechanical Engineering; Lightning; Heating; Weapons; Blasting\n* g: Physics\n* h: Electricity\n* y: General tagging of new or cross-sectional technology\n\n\nCurrent defaults are 2.1.2 version (fix update to cased raw strings) and 'all' CPC codes:\n\n\nTo use 1.0.0 version (lower cased tokenized words), pass both parameters 'codes' and 'version':",
"### Supported Tasks and Leaderboards",
"### Languages\n\n\nEnglish\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nEach instance contains a pair of 'description' and 'abstract'. 'description' is extracted from the Description section of the Patent while 'abstract' is extracted from the Abstract section.",
"### Data Fields\n\n\n* 'description': detailed description of patent.\n* 'abstract': Patent abastract.",
"### Data Splits\n\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information",
"### Contributions\n\n\nThanks to @mattbui for adding this dataset."
]
| [
"TAGS\n#task_categories-summarization #annotations_creators-no-annotation #language_creators-found #multilinguality-monolingual #size_categories-n<1k #source_datasets-big_patent #language-English #license-cc-by-4.0 #patent-summarization #arxiv-1906.03741 #region-us \n",
"### Dataset Summary\n\n\nBIGPATENT, consisting of 1.3 million records of U.S. patent documents along with human written abstractive summaries.\nEach US patent application is filed under a Cooperative Patent Classification (CPC) code.\nThere are nine such classification categories:\n\n\n* a: Human Necessities\n* b: Performing Operations; Transporting\n* c: Chemistry; Metallurgy\n* d: Textiles; Paper\n* e: Fixed Constructions\n* f: Mechanical Engineering; Lightning; Heating; Weapons; Blasting\n* g: Physics\n* h: Electricity\n* y: General tagging of new or cross-sectional technology\n\n\nCurrent defaults are 2.1.2 version (fix update to cased raw strings) and 'all' CPC codes:\n\n\nTo use 1.0.0 version (lower cased tokenized words), pass both parameters 'codes' and 'version':",
"### Supported Tasks and Leaderboards",
"### Languages\n\n\nEnglish\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nEach instance contains a pair of 'description' and 'abstract'. 'description' is extracted from the Description section of the Patent while 'abstract' is extracted from the Abstract section.",
"### Data Fields\n\n\n* 'description': detailed description of patent.\n* 'abstract': Patent abastract.",
"### Data Splits\n\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information",
"### Contributions\n\n\nThanks to @mattbui for adding this dataset."
]
|
190a0d9846927b3c693061169fb80d79e23237af | # Dataset Card for "INFO6105-llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | fightfei/INFO6105-llama2 | [
"region:us"
]
| 2023-12-07T17:56:58+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 424644.56925207755, "num_examples": 1429}, {"name": "test", "num_bytes": 4457.430747922438, "num_examples": 15}], "download_size": 60539, "dataset_size": 429102.0}} | 2023-12-07T17:57:38+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "INFO6105-llama2"
More Information needed | [
"# Dataset Card for \"INFO6105-llama2\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"INFO6105-llama2\"\n\nMore Information needed"
]
|
fffefcbc80b29938898a36c7120bb78330247fc1 | # Dataset Card for "llama2-political-guanaco"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | JayChauhan99/llama2-political-guanaco | [
"region:us"
]
| 2023-12-07T19:08:20+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6125821, "num_examples": 4676}], "download_size": 3395419, "dataset_size": 6125821}} | 2023-12-07T19:23:00+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "llama2-political-guanaco"
More Information needed | [
"# Dataset Card for \"llama2-political-guanaco\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"llama2-political-guanaco\"\n\nMore Information needed"
]
|
2b51de943c5a7612be76949f1c7d249910223a69 |
## File Structure
- root/
- data/
- images/
- 1_class1/
- img1.jpg
- img1.txt
- 1_class2/
- img2.png
- img2.txt
- regularization/
- reg1.jpg
- reg1.txt
## Parquet Format
### Fields:
| field name | datatype | description |
|---|---|---|
| name | string | name of file without file extension (ie: img1) |
| class | string | name of class folder without "1_" prefix or "regularization" (ie: class1) |
| image | image | name of image file (ie: img2.png) |
| caption | string | caption loaded from .txt file (ie: contents of img1.txt) | | pptd/kohyass_test | [
"region:us"
]
| 2023-12-07T19:51:59+00:00 | {"dataset_info": {"features": [{"name": "name", "dtype": "string"}, {"name": "class", "dtype": "string"}, {"name": "file_name", "dtype": "image"}, {"name": "caption", "dtype": "string"}]}, "configs": [{"config_name": "default", "data_files": "images.parquet"}, {"config_name": "regularization", "data_files": "regularization.parquet"}]} | 2023-12-08T16:40:31+00:00 | []
| []
| TAGS
#region-us
| File Structure
--------------
* root/
+ data/
- images/
* 1\_class1/
+ URL
+ URL
* 1\_class2/
+ URL
+ URL
- regularization/
* URL
* URL
Parquet Format
--------------
### Fields:
field name: name, datatype: string, description: name of file without file extension (ie: img1)
field name: class, datatype: string, description: name of class folder without "1\_" prefix or "regularization" (ie: class1)
field name: image, datatype: image, description: name of image file (ie: URL)
field name: caption, datatype: string, description: caption loaded from .txt file (ie: contents of URL)
| [
"### Fields:\n\n\nfield name: name, datatype: string, description: name of file without file extension (ie: img1)\nfield name: class, datatype: string, description: name of class folder without \"1\\_\" prefix or \"regularization\" (ie: class1)\nfield name: image, datatype: image, description: name of image file (ie: URL)\nfield name: caption, datatype: string, description: caption loaded from .txt file (ie: contents of URL)"
]
| [
"TAGS\n#region-us \n",
"### Fields:\n\n\nfield name: name, datatype: string, description: name of file without file extension (ie: img1)\nfield name: class, datatype: string, description: name of class folder without \"1\\_\" prefix or \"regularization\" (ie: class1)\nfield name: image, datatype: image, description: name of image file (ie: URL)\nfield name: caption, datatype: string, description: caption loaded from .txt file (ie: contents of URL)"
]
|
aea5f9d4fbefe12ed25109aa902b5bd83b507f69 |
# airoboros-3.0-serbian
<img src="https://cdn-uploads.huggingface.co/production/uploads/617bbeec14572ebe9e6ea83f/6d2AooENp1K6oNN5MUaNS.png" width="300"/>
***This dataset is a translation of the airoboros-3.0 datasets to Serbian Latin.***
**NOTE:**
I used various online translation APIs, so the quality of translations isn't perfect yet. However, I will try to refine them over time with the help of automated scripts and LLMs.
Huge thanks to Jondurbin (@jon_durbin) for creating the original dataset as well as the tools for creating it: [https://twitter.com/jon_durbin](https://twitter.com/jon_durbin).
Original dataset link: [https://huggingface.co/datasets/jondurbin/airoboros-3.0](https://huggingface.co/datasets/jondurbin/airoboros-3.0)
Original dataset card:
## Overview
This dataset builds upon the existing airoboros datasets, offering two significant additions:
* **MathJSON**: Provides solutions to mathematical problems using a JSON format that can be evaluated by dedicated libraries. This helps LLM training by reducing the need for extensive examples.
* **Anon-contributed RP dataset**: Enhances the dataset's multi-turn coherency, leading to more natural and engaging conversations.
Furthermore, this translated version makes the dataset accessible to a wider audience who primarily use Serbian Latin.
## Format
The dataset utilizes the ShareGPT format, ensuring compatibility with existing fine-tuning tools within the OS ecosystem.
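For illustration only (not from the original card): assuming the usual ShareGPT column layout (a `conversations` list of `from`/`value` turns) and a default `train` split, the records could be inspected like this:

```python
from datasets import load_dataset

# The column name "conversations" and the "train" split are assumptions based on
# the standard ShareGPT layout; adjust if the actual schema differs.
ds = load_dataset("draganjovanovich/airoboros-3.0-serbian", split="train")

sample = ds[0]
for turn in sample["conversations"]:
    print(turn["from"], "->", turn["value"][:80])
```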
## MathJSON
Large language models often struggle with complex mathematical concepts, particularly those involving floating-point operations, trigonometric functions, factorials, and large numbers.
The MathJSON category tackles this challenge by presenting solutions in a readily interpretable JSON format. This allows traditional computational libraries to evaluate the solutions, improving training efficiency and reducing the dependence on vast quantities of training data.
The dataset currently includes approximately 4,000 MathJSON samples, serving as a solid foundation for further development and expansion. As fine-tuned models gain a better understanding of this format, the dataset can be easily augmented, enabling them to represent and solve diverse mathematical problems.
For instance:
**Create a MathJSON solution to the following: Calculate the area of a circle with a radius of 17.2456 cm. Include your reasoning.**
Solution as MathJSON:
```
<mathjson>
[
"Multiply",
"Pi",
[
"Power",
17.2456,
2
]
]
</mathjson>
```
The JSON string within the `mathjson` tags can be extracted and evaluated using libraries such as [https://cortexjs.io/compute-engine/](https://cortexjs.io/compute-engine/) or custom implementations like [https://github.com/jondurbin/airoboros/blob/mathjson/airoboros/mathjson.py](https://github.com/jondurbin/airoboros/blob/mathjson/airoboros/mathjson.py).
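As a minimal sketch (not either of the implementations referenced above; it covers only the operators used in the example), the `<mathjson>` block can be extracted with a regular expression and evaluated recursively:

```python
import json
import math
import re

# Tiny illustrative evaluator for a handful of MathJSON operators;
# the libraries linked above cover far more of the specification.
OPS = {
    "Add": lambda args: sum(args),
    "Multiply": lambda args: math.prod(args),
    "Power": lambda args: args[0] ** args[1],
    "Divide": lambda args: args[0] / args[1],
}
CONSTANTS = {"Pi": math.pi}

def evaluate(node):
    if isinstance(node, list):      # e.g. ["Multiply", "Pi", [...]]
        op, *args = node
        return OPS[op]([evaluate(a) for a in args])
    if isinstance(node, str):       # named constant such as "Pi"
        return CONSTANTS[node]
    return node                     # plain number

def solve(completion: str) -> float:
    expr = re.search(r"<mathjson>(.*?)</mathjson>", completion, re.DOTALL).group(1)
    return evaluate(json.loads(expr))

print(solve('<mathjson>["Multiply", "Pi", ["Power", 17.2456, 2]]</mathjson>'))
# ~934.3, the area of the circle from the example above
```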
This approach facilitates efficient training and equips LLM models with the ability to understand and solve mathematical problems effectively.
| draganjovanovich/airoboros-3.0-serbian | [
"task_categories:conversational",
"language:sr",
"license:apache-2.0",
"region:us"
]
| 2023-12-07T19:58:20+00:00 | {"language": ["sr"], "license": "apache-2.0", "task_categories": ["conversational"]} | 2024-01-08T10:30:23+00:00 | []
| [
"sr"
]
| TAGS
#task_categories-conversational #language-Serbian #license-apache-2.0 #region-us
|
# airoboros-3.0-serbian
<img src="URL width="300"/>
*This dataset is a translation of the airoboros-3.0 datasets to Serbian Latin.*
NOTE:
I used various online translation APIs, so the quality of translations isn't perfect yet. However, I will try to refine them over time with the help of automated scripts and LLMs.
Huge thanks to Jondurbin (@jon_durbin) for creating the original dataset as well as the tools for creating it: URL
Original dataset link: URL
Original dataset card:
## Overview
This dataset builds upon the existing airoboros datasets, offering two significant additions:
* MathJSON: Provides solutions to mathematical problems using a JSON format that can be evaluated by dedicated libraries. This helps LLM training by reducing the need for extensive examples.
* Anon-contributed RP dataset: Enhances the dataset's multi-turn coherency, leading to more natural and engaging conversations.
Furthermore, this translated version makes the dataset accessible to a wider audience who primarily use Serbian Latin.
## Format
The dataset utilizes the ShareGPT format, ensuring compatibility with existing fine-tuning tools within the OS ecosystem.
## MathJSON
Large language models often struggle with complex mathematical concepts, particularly those involving floating-point operations, trigonometric functions, factorials, and large numbers.
The MathJSON category tackles this challenge by presenting solutions in a readily interpretable JSON format. This allows traditional computational libraries to evaluate the solutions, improving training efficiency and reducing the dependence on vast quantities of training data.
The dataset currently includes approximately 4,000 MathJSON samples, serving as a solid foundation for further development and expansion. As fine-tuned models gain a better understanding of this format, the dataset can be easily augmented, enabling them to represent and solve diverse mathematical problems.
For instance:
Create a MathJSON solution to the following: Calculate the area of a circle with a radius of 17.2456 cm. Include your reasoning.
Solution as MathJSON:
The JSON string within the 'mathjson' tags can be extracted and evaluated using libraries such as URL or custom implementations like URL
This approach facilitates efficient training and equips LLM models with the ability to understand and solve mathematical problems effectively.
| [
"# airoboros-3.0-serbian\n<img src=\"URL width=\"300\"/> \n\n*This dataset is a translation of the airoboros-3.0 datasets to Serbian Latin.* \n\nNOTE: \nI used various online translation APIs, so the quality of translations isn't perfect yet. However, I will try to refine them over time with the help of automated scripts and LLMs. \n\n\n \nHuge thanks to Jondurbin (@jon_durbin) for creating the original dataset as well as the tools for creating it: URL \n\nOriginal dataset link: URL \n \nOriginal dataset card:",
"## Overview\n\nThis dataset builds upon the existing airoboros datasets, offering two significant additions:\n\n* MathJSON: Provides solutions to mathematical problems using a JSON format that can be evaluated by dedicated libraries. This helps LLM training by reducing the need for extensive examples.\n* Anon-contributed RP dataset: Enhances the dataset's multi-turn coherency, leading to more natural and engaging conversations.\n\nFurthermore, this translated version makes the dataset accessible to a wider audience who primarily use Serbian Latin.",
"## Format\n\nThe dataset utilizes the ShareGPT format, ensuring compatibility with existing fine-tuning tools within the OS ecosystem.",
"## MathJSON\n\nLarge language models often struggle with complex mathematical concepts, particularly those involving floating-point operations, trigonometric functions, factorials, and large numbers.\n\nThe MathJSON category tackles this challenge by presenting solutions in a readily interpretable JSON format. This allows traditional computational libraries to evaluate the solutions, improving training efficiency and reducing the dependence on vast quantities of training data.\n\nThe dataset currently includes approximately 4,000 MathJSON samples, serving as a solid foundation for further development and expansion. As fine-tuned models gain a better understanding of this format, the dataset can be easily augmented, enabling them to represent and solve diverse mathematical problems.\n\nFor instance:\n\nCreate a MathJSON solution to the following: Calculate the area of a circle with a radius of 17.2456 cm. Include your reasoning.\n\nSolution as MathJSON:\n\n\nThe JSON string within the 'mathjson' tags can be extracted and evaluated using libraries such as URL or custom implementations like URL\n\nThis approach facilitates efficient training and equips LLM models with the ability to understand and solve mathematical problems effectively."
]
| [
"TAGS\n#task_categories-conversational #language-Serbian #license-apache-2.0 #region-us \n",
"# airoboros-3.0-serbian\n<img src=\"URL width=\"300\"/> \n\n*This dataset is a translation of the airoboros-3.0 datasets to Serbian Latin.* \n\nNOTE: \nI used various online translation APIs, so the quality of translations isn't perfect yet. However, I will try to refine them over time with the help of automated scripts and LLMs. \n\n\n \nHuge thanks to Jondurbin (@jon_durbin) for creating the original dataset as well as the tools for creating it: URL \n\nOriginal dataset link: URL \n \nOriginal dataset card:",
"## Overview\n\nThis dataset builds upon the existing airoboros datasets, offering two significant additions:\n\n* MathJSON: Provides solutions to mathematical problems using a JSON format that can be evaluated by dedicated libraries. This helps LLM training by reducing the need for extensive examples.\n* Anon-contributed RP dataset: Enhances the dataset's multi-turn coherency, leading to more natural and engaging conversations.\n\nFurthermore, this translated version makes the dataset accessible to a wider audience who primarily use Serbian Latin.",
"## Format\n\nThe dataset utilizes the ShareGPT format, ensuring compatibility with existing fine-tuning tools within the OS ecosystem.",
"## MathJSON\n\nLarge language models often struggle with complex mathematical concepts, particularly those involving floating-point operations, trigonometric functions, factorials, and large numbers.\n\nThe MathJSON category tackles this challenge by presenting solutions in a readily interpretable JSON format. This allows traditional computational libraries to evaluate the solutions, improving training efficiency and reducing the dependence on vast quantities of training data.\n\nThe dataset currently includes approximately 4,000 MathJSON samples, serving as a solid foundation for further development and expansion. As fine-tuned models gain a better understanding of this format, the dataset can be easily augmented, enabling them to represent and solve diverse mathematical problems.\n\nFor instance:\n\nCreate a MathJSON solution to the following: Calculate the area of a circle with a radius of 17.2456 cm. Include your reasoning.\n\nSolution as MathJSON:\n\n\nThe JSON string within the 'mathjson' tags can be extracted and evaluated using libraries such as URL or custom implementations like URL\n\nThis approach facilitates efficient training and equips LLM models with the ability to understand and solve mathematical problems effectively."
]
|
33eea2cbd68b0bf9054ebca7284494263a296404 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | gianlucar/test_contenzioso | [
"region:us"
]
| 2023-12-07T20:00:08+00:00 | {} | 2023-12-07T21:04:17+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
]
|
8afb351765d9e0fb895239f30315641494df2d89 |
# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Hermes-Synthia
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [LTC-AI-Labs/L2-7b-Hermes-Synthia](https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia",
"harness_winogrande_5",
split="train")
```
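The aggregated metrics mentioned above live in the "results" configuration; here is a minimal sketch of pulling the latest aggregated run, assuming it loads the same way as the per-task configurations:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "train" split always points to the latest run.
results = load_dataset(
    "open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia",
    "results",
    split="train",
)
print(results[0])
```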
## Latest results
These are the [latest results from run 2023-12-07T21:05:32.577722](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia/blob/main/results_2023-12-07T21-05-32.577722.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4795053047509253,
"acc_stderr": 0.03443814682805137,
"acc_norm": 0.4845931809792794,
"acc_norm_stderr": 0.03522205048499402,
"mc1": 0.32068543451652387,
"mc1_stderr": 0.0163391703732809,
"mc2": 0.46774826109346956,
"mc2_stderr": 0.01487542286406713
},
"harness|arc:challenge|25": {
"acc": 0.4803754266211604,
"acc_stderr": 0.01460013207594709,
"acc_norm": 0.5102389078498294,
"acc_norm_stderr": 0.014608326906285012
},
"harness|hellaswag|10": {
"acc": 0.5952997410874328,
"acc_stderr": 0.0048983081672118465,
"acc_norm": 0.7911770563632743,
"acc_norm_stderr": 0.0040563690969549464
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981748,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981748
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537313,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537313
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02351729433596328,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02351729433596328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.0416345303130286,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.0416345303130286
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5290322580645161,
"acc_stderr": 0.028396016402761,
"acc_norm": 0.5290322580645161,
"acc_norm_stderr": 0.028396016402761
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051451,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051451
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.03488901616852731,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.03488901616852731
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017838,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017838
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6587155963302752,
"acc_stderr": 0.020328612816592442,
"acc_norm": 0.6587155963302752,
"acc_norm_stderr": 0.020328612816592442
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.03441190023482464,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.03441190023482464
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4539877300613497,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.4539877300613497,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6513409961685823,
"acc_stderr": 0.01704124314349096,
"acc_norm": 0.6513409961685823,
"acc_norm_stderr": 0.01704124314349096
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.026882643434022885,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.026882643434022885
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767867,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767867
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5594855305466238,
"acc_stderr": 0.028196400574197422,
"acc_norm": 0.5594855305466238,
"acc_norm_stderr": 0.028196400574197422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.027701228468542602,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.027701228468542602
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3520208604954368,
"acc_stderr": 0.012198140605353599,
"acc_norm": 0.3520208604954368,
"acc_norm_stderr": 0.012198140605353599
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.020165523313907904,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.020165523313907904
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268813,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268813
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32068543451652387,
"mc1_stderr": 0.0163391703732809,
"mc2": 0.46774826109346956,
"mc2_stderr": 0.01487542286406713
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.01224880696937642
},
"harness|gsm8k|5": {
"acc": 0.13949962092494314,
"acc_stderr": 0.00954342668719131
}
}
```
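If you only need these aggregated numbers, the raw JSON file can also be fetched directly. A sketch using `huggingface_hub` (the filename is taken from the link above; the exact JSON layout may nest the figures shown above under additional keys):

```python
import json

from huggingface_hub import hf_hub_download

# Download the results file from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia",
    filename="results_2023-12-07T21-05-32.577722.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Inspect the top-level structure before drilling down to the per-task metrics.
print(list(results.keys()))
```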
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia | [
"region:us"
]
| 2023-12-07T21:08:28+00:00 | {"pretty_name": "Evaluation run of LTC-AI-Labs/L2-7b-Hermes-Synthia", "dataset_summary": "Dataset automatically created during the evaluation run of model [LTC-AI-Labs/L2-7b-Hermes-Synthia](https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-07T21:05:32.577722](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia/blob/main/results_2023-12-07T21-05-32.577722.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4795053047509253,\n \"acc_stderr\": 0.03443814682805137,\n \"acc_norm\": 0.4845931809792794,\n \"acc_norm_stderr\": 0.03522205048499402,\n \"mc1\": 0.32068543451652387,\n \"mc1_stderr\": 0.0163391703732809,\n \"mc2\": 0.46774826109346956,\n \"mc2_stderr\": 0.01487542286406713\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4803754266211604,\n \"acc_stderr\": 0.01460013207594709,\n \"acc_norm\": 0.5102389078498294,\n \"acc_norm_stderr\": 0.014608326906285012\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5952997410874328,\n \"acc_stderr\": 0.0048983081672118465,\n \"acc_norm\": 0.7911770563632743,\n \"acc_norm_stderr\": 0.0040563690969549464\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981748,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981748\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537313,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537313\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02351729433596328,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02351729433596328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.0416345303130286,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.0416345303130286\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n \"acc_stderr\": 0.028396016402761,\n \"acc_norm\": 0.5290322580645161,\n \"acc_norm_stderr\": 0.028396016402761\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051451,\n \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051451\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.601010101010101,\n \"acc_stderr\": 0.03488901616852731,\n \"acc_norm\": 0.601010101010101,\n \"acc_norm_stderr\": 0.03488901616852731\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017838,\n \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017838\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6587155963302752,\n \"acc_stderr\": 0.020328612816592442,\n \"acc_norm\": 0.6587155963302752,\n \"acc_norm_stderr\": 0.020328612816592442\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.03441190023482464,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.03441190023482464\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4539877300613497,\n \"acc_stderr\": 0.0391170190467718,\n \"acc_norm\": 0.4539877300613497,\n \"acc_norm_stderr\": 0.0391170190467718\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6513409961685823,\n \"acc_stderr\": 0.01704124314349096,\n 
\"acc_norm\": 0.6513409961685823,\n \"acc_norm_stderr\": 0.01704124314349096\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.026882643434022885,\n \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.026882643434022885\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767867,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767867\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5594855305466238,\n \"acc_stderr\": 0.028196400574197422,\n \"acc_norm\": 0.5594855305466238,\n \"acc_norm_stderr\": 0.028196400574197422\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542602,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542602\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3520208604954368,\n \"acc_stderr\": 0.012198140605353599,\n \"acc_norm\": 0.3520208604954368,\n \"acc_norm_stderr\": 0.012198140605353599\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121596,\n \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121596\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.020165523313907904,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.020165523313907904\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.031987615467631264,\n \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.031987615467631264\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n \"acc_stderr\": 0.03390393042268813,\n \"acc_norm\": 0.6417910447761194,\n \"acc_norm_stderr\": 0.03390393042268813\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32068543451652387,\n \"mc1_stderr\": 0.0163391703732809,\n \"mc2\": 0.46774826109346956,\n \"mc2_stderr\": 0.01487542286406713\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.01224880696937642\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13949962092494314,\n \"acc_stderr\": 0.00954342668719131\n }\n}\n```", "repo_url": 
"https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|arc:challenge|25_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|gsm8k|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hellaswag|10_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-05-32.577722.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-05-32.577722.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-05-32.577722.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T21-05-32.577722.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-05-32.577722.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-05-32.577722.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["**/details_harness|winogrande|5_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-07T21-05-32.577722.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_07T21_05_32.577722", "path": ["results_2023-12-07T21-05-32.577722.parquet"]}, {"split": "latest", "path": 
["results_2023-12-07T21-05-32.577722.parquet"]}]}]} | 2023-12-07T21:09:14+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Hermes-Synthia
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model LTC-AI-Labs/L2-7b-Hermes-Synthia on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
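A minimal sketch of that load call is given below; the repository id is an assumption here, inferred from the leaderboard's usual `details_<org>__<model>` naming convention (it is not spelled out in this processed card), and `harness_winogrande_5` is one of the configurations listed in the metadata above:
```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's naming convention for this card.
# The metadata defines a timestamped split and a "latest" split for each configuration.
data = load_dataset("open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia",
	"harness_winogrande_5",
	split="latest")
```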
## Latest results
These are the latest results from run 2023-12-07T21:05:32.577722 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Hermes-Synthia",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model LTC-AI-Labs/L2-7b-Hermes-Synthia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T21:05:32.577722(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Hermes-Synthia",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model LTC-AI-Labs/L2-7b-Hermes-Synthia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T21:05:32.577722(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
57585fc2a435c782934699ec6f2756a730977cd5 | # Dataset Card for "quirky_sciq_pythia-2.8b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | atmallen/quirky_sciq_pythia-2.8b | [
"region:us"
]
| 2023-12-07T21:37:48+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "label", "dtype": "int64"}, {"name": "log_odds", "dtype": "float32"}, {"name": "statement", "dtype": "string"}, {"name": "character", "dtype": "string"}, {"name": "alice_label", "dtype": "int64"}, {"name": "bob_label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 300268, "num_examples": 1358}, {"name": "validation", "num_bytes": 441946, "num_examples": 2000}, {"name": "test", "num_bytes": 4454, "num_examples": 20}], "download_size": 0, "dataset_size": 746668}} | 2023-12-07T21:42:13+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "quirky_sciq_pythia-2.8b"
More Information needed | [
"# Dataset Card for \"quirky_sciq_pythia-2.8b\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"quirky_sciq_pythia-2.8b\"\n\nMore Information needed"
]
|
ba59cc3b2e378543ba3bec8d2d03299bf18e54eb |
# Dataset Card for Evaluation run of uukuguy/neural-chat-7b-v3-1-dare-0.85
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/neural-chat-7b-v3-1-dare-0.85
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/neural-chat-7b-v3-1-dare-0.85](https://huggingface.co/uukuguy/neural-chat-7b-v3-1-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85",
"harness_winogrande_5",
split="train")
```
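The aggregated metrics can be pulled the same way; the following is a minimal sketch, assuming the "results" configuration exposes a "latest" split in the same way as the per-task configurations:
```python
from datasets import load_dataset

# Load the aggregated results of the run; the "latest" split points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85",
	"results",
	split="latest")
```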
## Latest results
These are the [latest results from run 2023-12-07T21:41:35.710987](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85/blob/main/results_2023-12-07T21-41-35.710987.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6418688892912601,
"acc_stderr": 0.032096127796783475,
"acc_norm": 0.6471999464850214,
"acc_norm_stderr": 0.03273684834996561,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4490316447301984,
"mc2_stderr": 0.014392988561545562
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.01443413871337998,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349812
},
"harness|hellaswag|10": {
"acc": 0.6362278430591516,
"acc_stderr": 0.0048010096576904405,
"acc_norm": 0.8383788090021908,
"acc_norm_stderr": 0.0036735065123709503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082637,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082637
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010354,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520429,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520429
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4490316447301984,
"mc2_stderr": 0.014392988561545562
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987727
},
"harness|gsm8k|5": {
"acc": 0.42153146322971946,
"acc_stderr": 0.013601824409483272
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85 | [
"region:us"
]
| 2023-12-07T21:44:27+00:00 | {"pretty_name": "Evaluation run of uukuguy/neural-chat-7b-v3-1-dare-0.85", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/neural-chat-7b-v3-1-dare-0.85](https://huggingface.co/uukuguy/neural-chat-7b-v3-1-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-07T21:41:35.710987](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__neural-chat-7b-v3-1-dare-0.85/blob/main/results_2023-12-07T21-41-35.710987.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6418688892912601,\n \"acc_stderr\": 0.032096127796783475,\n \"acc_norm\": 0.6471999464850214,\n \"acc_norm_stderr\": 0.03273684834996561,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4490316447301984,\n \"mc2_stderr\": 0.014392988561545562\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.01443413871337998,\n \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349812\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6362278430591516,\n \"acc_stderr\": 0.0048010096576904405,\n \"acc_norm\": 0.8383788090021908,\n \"acc_norm_stderr\": 0.0036735065123709503\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462456\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 
0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010354,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010354\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8135376756066411,\n \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n \"acc_stderr\": 0.015201032512520429,\n \"acc_norm\": 0.2916201117318436,\n \"acc_norm_stderr\": 0.015201032512520429\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912258,\n \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912258\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4490316447301984,\n \"mc2_stderr\": 0.014392988561545562\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987727\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42153146322971946,\n \"acc_stderr\": 
0.013601824409483272\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/neural-chat-7b-v3-1-dare-0.85", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|arc:challenge|25_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|gsm8k|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hellaswag|10_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-41-35.710987.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-41-35.710987.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-41-35.710987.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T21-41-35.710987.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-41-35.710987.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_07T21_41_35.710987", "path": ["**/details_harness|winogrande|5_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-07T21-41-35.710987.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_07T21_41_35.710987", "path": ["results_2023-12-07T21-41-35.710987.parquet"]}, {"split": "latest", "path": ["results_2023-12-07T21-41-35.710987.parquet"]}]}]} | 2023-12-07T21:45:13+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/neural-chat-7b-v3-1-dare-0.85
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model uukuguy/neural-chat-7b-v3-1-dare-0.85 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-07T21:41:35.710987 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of uukuguy/neural-chat-7b-v3-1-dare-0.85",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/neural-chat-7b-v3-1-dare-0.85 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T21:41:35.710987(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/neural-chat-7b-v3-1-dare-0.85",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/neural-chat-7b-v3-1-dare-0.85 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T21:41:35.710987(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
7de631f561e7e4ac66eeae2b796db252ba5281c3 |
# Dataset Card for Evaluation run of Q-bert/MetaMath-Cybertron
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Q-bert/MetaMath-Cybertron
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Q-bert/MetaMath-Cybertron](https://huggingface.co/Q-bert/MetaMath-Cybertron) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron",
"harness_winogrande_5",
split="train")
```
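Beyond a single task configuration, you can also enumerate the available configurations and pull the aggregated metrics. The snippet below is a minimal sketch: the `results` configuration and the `latest` split names are taken from this card's metadata, and the exact record layout may differ slightly from run to run.

```python
from datasets import get_dataset_config_names, load_dataset

# List every available configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron")
print(len(configs), configs[:5])

# Load the aggregated metrics from the "results" configuration; the splits in
# this repo are named "latest" and by run timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron",
    "results",
    split="latest",
)
print(results[0])
```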
## Latest results
These are the [latest results from run 2023-12-07T21:43:38.456468](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron/blob/main/results_2023-12-07T21-43-38.456468.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.641342115405787,
"acc_stderr": 0.032232870272022124,
"acc_norm": 0.6412913403995665,
"acc_norm_stderr": 0.032896201038175164,
"mc1": 0.408812729498164,
"mc1_stderr": 0.017209952151641734,
"mc2": 0.5770577317207616,
"mc2_stderr": 0.015307336326138697
},
"harness|arc:challenge|25": {
"acc": 0.636518771331058,
"acc_stderr": 0.014056207319068283,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.013796182947785562
},
"harness|hellaswag|10": {
"acc": 0.6677952599083847,
"acc_stderr": 0.004700413824942566,
"acc_norm": 0.8554072893845848,
"acc_norm_stderr": 0.0035097096477918373
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.041443118108781526,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.041443118108781526
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997685,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997685
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922531,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922531
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500666,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500666
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4498044328552803,
"acc_stderr": 0.012705721498565106,
"acc_norm": 0.4498044328552803,
"acc_norm_stderr": 0.012705721498565106
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.019249785691717217,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.019249785691717217
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.408812729498164,
"mc1_stderr": 0.017209952151641734,
"mc2": 0.5770577317207616,
"mc2_stderr": 0.015307336326138697
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626922
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954774
}
}
```
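As a quick illustration of how the per-task entries above can be aggregated, the sketch below averages `acc_norm` over the MMLU (`hendrycksTest-*`) subjects. The `results` dict stands in for the JSON shown above; only a small excerpt is inlined so the snippet stays self-contained, with values copied from this run.

```python
# Minimal sketch: average acc_norm over the MMLU (hendrycksTest-*) subjects
# from a results dict shaped like the JSON above. Only an excerpt is inlined.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6973684210526315},
}

mmlu_scores = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mean_acc_norm = sum(mmlu_scores) / len(mmlu_scores)
print(f"{len(mmlu_scores)} MMLU subjects, mean acc_norm = {mean_acc_norm:.4f}")
```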
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron | [
"region:us"
]
| 2023-12-07T21:46:29+00:00 | {"pretty_name": "Evaluation run of Q-bert/MetaMath-Cybertron", "dataset_summary": "Dataset automatically created during the evaluation run of model [Q-bert/MetaMath-Cybertron](https://huggingface.co/Q-bert/MetaMath-Cybertron) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-07T21:43:38.456468](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron/blob/main/results_2023-12-07T21-43-38.456468.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.641342115405787,\n \"acc_stderr\": 0.032232870272022124,\n \"acc_norm\": 0.6412913403995665,\n \"acc_norm_stderr\": 0.032896201038175164,\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.017209952151641734,\n \"mc2\": 0.5770577317207616,\n \"mc2_stderr\": 0.015307336326138697\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.636518771331058,\n \"acc_stderr\": 0.014056207319068283,\n \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.013796182947785562\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6677952599083847,\n \"acc_stderr\": 0.004700413824942566,\n \"acc_norm\": 0.8554072893845848,\n \"acc_norm_stderr\": 0.0035097096477918373\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 
0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.041443118108781526,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.041443118108781526\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997685,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997685\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342853,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342853\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n 
\"acc_stderr\": 0.013853724170922531,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922531\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n \"acc_stderr\": 0.016482782187500666,\n \"acc_norm\": 0.41564245810055866,\n \"acc_norm_stderr\": 0.016482782187500666\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4498044328552803,\n \"acc_stderr\": 0.012705721498565106,\n \"acc_norm\": 0.4498044328552803,\n \"acc_norm_stderr\": 0.012705721498565106\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.019249785691717217,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.019249785691717217\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.017209952151641734,\n \"mc2\": 0.5770577317207616,\n \"mc2_stderr\": 0.015307336326138697\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626922\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \"acc_stderr\": 0.012560698010954774\n }\n}\n```", 
"repo_url": "https://huggingface.co/Q-bert/MetaMath-Cybertron", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|arc:challenge|25_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|gsm8k|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hellaswag|10_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-43-38.456468.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-43-38.456468.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-43-38.456468.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T21-43-38.456468.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-43-38.456468.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_07T21_43_38.456468", "path": ["**/details_harness|winogrande|5_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-07T21-43-38.456468.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_07T21_43_38.456468", "path": ["results_2023-12-07T21-43-38.456468.parquet"]}, {"split": "latest", "path": ["results_2023-12-07T21-43-38.456468.parquet"]}]}]} | 2023-12-07T21:47:14+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of Q-bert/MetaMath-Cybertron
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Q-bert/MetaMath-Cybertron on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-07T21:43:38.456468 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Q-bert/MetaMath-Cybertron",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/MetaMath-Cybertron on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T21:43:38.456468(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Q-bert/MetaMath-Cybertron",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/MetaMath-Cybertron on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T21:43:38.456468(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
07dc51a62f9d04228a6a678d3154a45d77ca33a6 |
# Dataset Card for Evaluation run of Q-bert/MetaMath-Cybertron-Starling
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Q-bert/MetaMath-Cybertron-Starling
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Q-bert/MetaMath-Cybertron-Starling](https://huggingface.co/Q-bert/MetaMath-Cybertron-Starling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron-Starling",
"harness_winogrande_5",
split="train")
```
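
If you want the aggregated metrics rather than a single task, the same call works with the "results" configuration. This is a minimal sketch assuming this repo follows the same config/split layout as the metadata above (a "results" config with a "latest" split pointing at the most recent run):

```python
from datasets import load_dataset

# Aggregated results for the most recent run; "results" and "latest" are the
# config/split names used by these leaderboard detail datasets (assumed here
# to match the layout described in this card's metadata).
results = load_dataset(
    "open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron-Starling",
    "results",
    split="latest",
)

# Each row holds the serialized metrics for one run.
print(results[0])
```

Any of the `harness_*` config names listed in the metadata can be substituted for `harness_winogrande_5` in the snippet above.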
## Latest results
These are the [latest results from run 2023-12-09T15:17:27.299396](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron-Starling/blob/main/results_2023-12-09T15-17-27.299396.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6546823749503232,
"acc_stderr": 0.03199672185414611,
"acc_norm": 0.654898832536071,
"acc_norm_stderr": 0.0326573169076113,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5595409341399591,
"mc2_stderr": 0.015240876129773097
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882417,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693252
},
"harness|hellaswag|10": {
"acc": 0.6696873132842064,
"acc_stderr": 0.004693644357202051,
"acc_norm": 0.8625771758613822,
"acc_norm_stderr": 0.0034358953866922537
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971114,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971114
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.016392221899407082,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.016392221899407082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079069,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031215,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031215
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5595409341399591,
"mc2_stderr": 0.015240876129773097
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.01095971643524291
},
"harness|gsm8k|5": {
"acc": 0.7149355572403336,
"acc_stderr": 0.012435042334904
}
}
```
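
If you download the linked results file, the headline numbers above can be read back programmatically. The sketch below assumes the JSON has exactly the structure printed here (an "all" aggregate plus per-task "harness|..." entries); the hosted file may nest these under additional keys.

```python
import json

# Assumes a local copy of results_2023-12-09T15-17-27.299396.json with the
# structure shown above: an "all" aggregate plus per-task "harness|..." entries.
with open("results_2023-12-09T15-17-27.299396.json") as f:
    results = json.load(f)

print("average acc :", results["all"]["acc"])
print("ARC acc_norm:", results["harness|arc:challenge|25"]["acc_norm"])
print("GSM8K acc   :", results["harness|gsm8k|5"]["acc"])
```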
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron-Starling | [
"region:us"
]
| 2023-12-07T22:02:48+00:00 | {"pretty_name": "Evaluation run of Q-bert/MetaMath-Cybertron-Starling", "dataset_summary": "Dataset automatically created during the evaluation run of model [Q-bert/MetaMath-Cybertron-Starling](https://huggingface.co/Q-bert/MetaMath-Cybertron-Starling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron-Starling\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:17:27.299396](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron-Starling/blob/main/results_2023-12-09T15-17-27.299396.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546823749503232,\n \"acc_stderr\": 0.03199672185414611,\n \"acc_norm\": 0.654898832536071,\n \"acc_norm_stderr\": 0.0326573169076113,\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5595409341399591,\n \"mc2_stderr\": 0.015240876129773097\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882417,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693252\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6696873132842064,\n \"acc_stderr\": 0.004693644357202051,\n \"acc_norm\": 0.8625771758613822,\n \"acc_norm_stderr\": 0.0034358953866922537\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971114,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971114\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n \"acc_stderr\": 0.016392221899407082,\n \"acc_norm\": 0.4011173184357542,\n \"acc_norm_stderr\": 0.016392221899407082\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079069,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079069\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031215,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031215\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5595409341399591,\n \"mc2_stderr\": 0.015240876129773097\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.01095971643524291\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7149355572403336,\n \"acc_stderr\": 0.012435042334904\n 
}\n}\n```", "repo_url": "https://huggingface.co/Q-bert/MetaMath-Cybertron-Starling", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|arc:challenge|25_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|gsm8k|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hellaswag|10_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-59-56.458563.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T21-59-56.458563.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-17-27.299396.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-17-27.299396.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-17-27.299396.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-17-27.299396.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-59-56.458563.parquet"]}, 
{"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["**/details_harness|winogrande|5_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": ["**/details_harness|winogrande|5_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-17-27.299396.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_07T21_59_56.458563", "path": ["results_2023-12-07T21-59-56.458563.parquet"]}, {"split": "2023_12_09T15_17_27.299396", "path": 
["results_2023-12-09T15-17-27.299396.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T15-17-27.299396.parquet"]}]}]} | 2023-12-09T15:21:03+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of Q-bert/MetaMath-Cybertron-Starling
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Q-bert/MetaMath-Cybertron-Starling on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
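For instance, assuming the details repository follows the usual `details_<org>__<model>` naming pattern of the Open LLM Leaderboard, the details for one task can be loaded as:

```python
from datasets import load_dataset

# Repository id is an assumption based on the standard details naming pattern.
data = load_dataset("open-llm-leaderboard/details_Q-bert__MetaMath-Cybertron-Starling",
	"harness_winogrande_5",
	split="train")
```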
## Latest results
These are the latest results from run 2023-12-09T15:17:27.299396 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Q-bert/MetaMath-Cybertron-Starling",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/MetaMath-Cybertron-Starling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-09T15:17:27.299396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Q-bert/MetaMath-Cybertron-Starling",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/MetaMath-Cybertron-Starling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-09T15:17:27.299396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
2cef9ab12f13e8dc3a7d69ac10fe0071f8e5db91 |
# Dataset Card for Evaluation run of migtissera/Tess-7B-v1.4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Tess-7B-v1.4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Tess-7B-v1.4](https://huggingface.co/migtissera/Tess-7B-v1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-7B-v1.4",
"harness_winogrande_5",
split="train")
```
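The aggregated metrics can be read the same way; a minimal sketch, assuming the "results" configuration and its "latest" split follow the layout described above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split is
# assumed to point at the most recent evaluation, like the per-task configs.
results = load_dataset("open-llm-leaderboard/details_migtissera__Tess-7B-v1.4",
	"results",
	split="latest")
```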
## Latest results
These are the [latest results from run 2023-12-07T22:12:44.585661](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-7B-v1.4/blob/main/results_2023-12-07T22-12-44.585661.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6088715659432319,
"acc_stderr": 0.03312437671303067,
"acc_norm": 0.6134610853544302,
"acc_norm_stderr": 0.03378947565640712,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5187917410145858,
"mc2_stderr": 0.015933996625694287
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642473,
"acc_norm": 0.6040955631399317,
"acc_norm_stderr": 0.014291228393536587
},
"harness|hellaswag|10": {
"acc": 0.6409081856203943,
"acc_stderr": 0.0047875373851530055,
"acc_norm": 0.8287193786098387,
"acc_norm_stderr": 0.0037598401271507057
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644823,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404897,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404897
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915434,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794087,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913912,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913912
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.024883140570071762,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.024883140570071762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.016018239710513398,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.016018239710513398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364805,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364805
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.01272978538659856,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.01272978538659856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.019431775677037313,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.019431775677037313
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417482,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417482
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5187917410145858,
"mc2_stderr": 0.015933996625694287
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259764
},
"harness|gsm8k|5": {
"acc": 0.42153146322971946,
"acc_stderr": 0.013601824409483272
}
}
```
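For quick inspection, the dictionary above can also be traversed programmatically. A minimal sketch, assuming the snippet has been saved locally as `latest_results.json` (a hypothetical file name) with exactly the structure shown:

```python
import json

# Hypothetical local copy of the results dictionary printed above.
with open("latest_results.json") as f:
    results = json.load(f)

# Print the preferred metric for each benchmark, skipping the aggregate "all" entry.
for task, metrics in results.items():
    if task == "all":
        continue
    value = metrics.get("acc_norm") or metrics.get("mc2") or metrics.get("acc")
    print(f"{task:55s} {value:.4f}")
```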
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_migtissera__Tess-7B-v1.4 | [
"region:us"
]
| 2023-12-07T22:15:35+00:00 | {"pretty_name": "Evaluation run of migtissera/Tess-7B-v1.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Tess-7B-v1.4](https://huggingface.co/migtissera/Tess-7B-v1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-7B-v1.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-07T22:12:44.585661](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-7B-v1.4/blob/main/results_2023-12-07T22-12-44.585661.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6088715659432319,\n \"acc_stderr\": 0.03312437671303067,\n \"acc_norm\": 0.6134610853544302,\n \"acc_norm_stderr\": 0.03378947565640712,\n \"mc1\": 0.3623011015911873,\n \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5187917410145858,\n \"mc2_stderr\": 0.015933996625694287\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642473,\n \"acc_norm\": 0.6040955631399317,\n \"acc_norm_stderr\": 0.014291228393536587\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6409081856203943,\n \"acc_stderr\": 0.0047875373851530055,\n \"acc_norm\": 0.8287193786098387,\n \"acc_norm_stderr\": 0.0037598401271507057\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644823,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644823\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.032500536843658404,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.032500536843658404\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404897,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404897\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915434,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794087,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794087\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n \"acc_stderr\": 0.014214138556913912,\n 
\"acc_norm\": 0.8033205619412516,\n \"acc_norm_stderr\": 0.014214138556913912\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071762,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071762\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n \"acc_stderr\": 0.016018239710513398,\n \"acc_norm\": 0.3564245810055866,\n \"acc_norm_stderr\": 0.016018239710513398\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n \"acc_stderr\": 0.02698147804364805,\n \"acc_norm\": 0.6559485530546624,\n \"acc_norm_stderr\": 0.02698147804364805\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.01272978538659856,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.01272978538659856\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.019431775677037313,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.019431775677037313\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417482,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417482\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5187917410145858,\n \"mc2_stderr\": 0.015933996625694287\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42153146322971946,\n \"acc_stderr\": 0.013601824409483272\n }\n}\n```", "repo_url": 
"https://huggingface.co/migtissera/Tess-7B-v1.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|arc:challenge|25_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|gsm8k|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hellaswag|10_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T22-12-44.585661.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T22-12-44.585661.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-07T22-12-44.585661.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T22-12-44.585661.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T22-12-44.585661.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T22-12-44.585661.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["**/details_harness|winogrande|5_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-07T22-12-44.585661.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_07T22_12_44.585661", "path": ["results_2023-12-07T22-12-44.585661.parquet"]}, {"split": "latest", "path": 
["results_2023-12-07T22-12-44.585661.parquet"]}]}]} | 2023-12-07T22:16:21+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of migtissera/Tess-7B-v1.4
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model migtissera/Tess-7B-v1.4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
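A minimal loading sketch follows; the repository id `open-llm-leaderboard/details_migtissera__Tess-7B-v1.4` is assumed here from the naming pattern these leaderboard detail datasets follow, and any other configuration name can be substituted for `harness_winogrande_5`:

```python
# Sketch: load one evaluation configuration of this details dataset.
# The repo id is assumed from the usual open-llm-leaderboard naming pattern.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_migtissera__Tess-7B-v1.4",
    "harness_winogrande_5",
    split="train",  # per the card, "train" always points to the latest results
)
print(data)
```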
## Latest results
These are the latest results from run 2023-12-07T22:12:44.585661 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of migtissera/Tess-7B-v1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Tess-7B-v1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T22:12:44.585661(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of migtissera/Tess-7B-v1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Tess-7B-v1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T22:12:44.585661(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
d8000777d9830406e46848461ae5993d0044d0cf |
A concatenated instruction-based dataset from the following:
- NbAiLab/norwegian-alpaca
- RuterNorway/Fleurs-Alpaca-EN-NO
- RuterNorway/OpenOrcaNo-15k | tollefj/nor-instruct | [
"region:us"
]
| 2023-12-07T23:05:46+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 33165183, "num_examples": 67714}, {"name": "test", "num_bytes": 324411, "num_examples": 684}], "download_size": 20779304, "dataset_size": 33489594}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-08T21:19:57+00:00 | []
| []
| TAGS
#region-us
|
A concatenated instruction-based dataset from the following:
- NbAiLab/norwegian-alpaca
- RuterNorway/Fleurs-Alpaca-EN-NO
- RuterNorway/OpenOrcaNo-15k | []
| [
"TAGS\n#region-us \n"
]
|
7f8cc04b1df7671670717a804fe594d2e5ffe758 | # Dataset Card for "ffmperative_augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | remyxai/ffmperative_augmented | [
"region:us"
]
| 2023-12-07T23:10:29+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 76872282, "num_examples": 52893}], "download_size": 13359933, "dataset_size": 76872282}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-07T23:10:33+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "ffmperative_augmented"
More Information needed | [
"# Dataset Card for \"ffmperative_augmented\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"ffmperative_augmented\"\n\nMore Information needed"
]
|
fe48802073257da857bfeff36b01acd289d4a335 |
# Dataset Card for Evaluation run of ericzzz/falcon-rw-1b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ericzzz/falcon-rw-1b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [ericzzz/falcon-rw-1b-chat](https://huggingface.co/ericzzz/falcon-rw-1b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ericzzz__falcon-rw-1b-chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-07T23:20:03.444693](https://huggingface.co/datasets/open-llm-leaderboard/details_ericzzz__falcon-rw-1b-chat/blob/main/results_2023-12-07T23-20-03.444693.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25244947238387594,
"acc_stderr": 0.030641727986962762,
"acc_norm": 0.2531449615242294,
"acc_norm_stderr": 0.03139093733494742,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.01525011707915649,
"mc2": 0.3961611921091927,
"mc2_stderr": 0.014575233458509149
},
"harness|arc:challenge|25": {
"acc": 0.33276450511945393,
"acc_stderr": 0.013769863046192302,
"acc_norm": 0.35580204778157,
"acc_norm_stderr": 0.013990571137918758
},
"harness|hellaswag|10": {
"acc": 0.4642501493726349,
"acc_stderr": 0.0049770106704365566,
"acc_norm": 0.6112328221469827,
"acc_norm_stderr": 0.0048647401340436705
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.032477811859955935,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.032477811859955935
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343601,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343601
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.026055296901152915,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.026055296901152915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198823,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198823
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.02880998985410298,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.02880998985410298
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.18620689655172415,
"acc_stderr": 0.03243946159004617,
"acc_norm": 0.18620689655172415,
"acc_norm_stderr": 0.03243946159004617
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21957671957671956,
"acc_stderr": 0.021320018599770355,
"acc_norm": 0.21957671957671956,
"acc_norm_stderr": 0.021320018599770355
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1870967741935484,
"acc_stderr": 0.02218571009225226,
"acc_norm": 0.1870967741935484,
"acc_norm_stderr": 0.02218571009225226
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.02798672466673622,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.02798672466673622
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926763,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926763
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1981651376146789,
"acc_stderr": 0.017090573804217885,
"acc_norm": 0.1981651376146789,
"acc_norm_stderr": 0.017090573804217885
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3037974683544304,
"acc_stderr": 0.02993669638713861,
"acc_norm": 0.3037974683544304,
"acc_norm_stderr": 0.02993669638713861
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.03138147637575498,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.03138147637575498
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591204,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591204
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.20245398773006135,
"acc_stderr": 0.031570650789119026,
"acc_norm": 0.20245398773006135,
"acc_norm_stderr": 0.031570650789119026
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3034188034188034,
"acc_stderr": 0.03011821010694267,
"acc_norm": 0.3034188034188034,
"acc_norm_stderr": 0.03011821010694267
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2567049808429119,
"acc_stderr": 0.015620480263064536,
"acc_norm": 0.2567049808429119,
"acc_norm_stderr": 0.015620480263064536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210742,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113596,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113596
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24382716049382716,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.24382716049382716,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872402,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872402
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2522816166883963,
"acc_stderr": 0.011092789056875236,
"acc_norm": 0.2522816166883963,
"acc_norm_stderr": 0.011092789056875236
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.026040662474201264,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.026040662474201264
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27346938775510204,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.27346938775510204,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.031524391865554044,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.031524391865554044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.01525011707915649,
"mc2": 0.3961611921091927,
"mc2_stderr": 0.014575233458509149
},
"harness|winogrande|5": {
"acc": 0.6172059984214681,
"acc_stderr": 0.013660946109442006
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.003527595888722403
}
}
```
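To work with these aggregated numbers programmatically, one option is to load the "results" configuration directly; this is a sketch assuming the "results" config exposes a "latest" split, matching the layout of the per-task configurations above:

```python
# Sketch: load the aggregated "results" configuration of this details dataset.
# Assumes a "latest" split, as in the per-task configurations listed above.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_ericzzz__falcon-rw-1b-chat",
    "results",
    split="latest",
)
print(results)
```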
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_ericzzz__falcon-rw-1b-chat | [
"region:us"
]
| 2023-12-07T23:22:10+00:00 | {"pretty_name": "Evaluation run of ericzzz/falcon-rw-1b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [ericzzz/falcon-rw-1b-chat](https://huggingface.co/ericzzz/falcon-rw-1b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ericzzz__falcon-rw-1b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-07T23:20:03.444693](https://huggingface.co/datasets/open-llm-leaderboard/details_ericzzz__falcon-rw-1b-chat/blob/main/results_2023-12-07T23-20-03.444693.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25244947238387594,\n \"acc_stderr\": 0.030641727986962762,\n \"acc_norm\": 0.2531449615242294,\n \"acc_norm_stderr\": 0.03139093733494742,\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.01525011707915649,\n \"mc2\": 0.3961611921091927,\n \"mc2_stderr\": 0.014575233458509149\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.33276450511945393,\n \"acc_stderr\": 0.013769863046192302,\n \"acc_norm\": 0.35580204778157,\n \"acc_norm_stderr\": 0.013990571137918758\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4642501493726349,\n \"acc_stderr\": 0.0049770106704365566,\n \"acc_norm\": 0.6112328221469827,\n \"acc_norm_stderr\": 0.0048647401340436705\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n \"acc_stderr\": 0.032477811859955935,\n \"acc_norm\": 0.17037037037037037,\n \"acc_norm_stderr\": 0.032477811859955935\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343601,\n \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343601\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.026055296901152915,\n \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.026055296901152915\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 
0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.030631145539198823,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.030631145539198823\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410298,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410298\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.18620689655172415,\n \"acc_stderr\": 0.03243946159004617,\n \"acc_norm\": 0.18620689655172415,\n \"acc_norm_stderr\": 0.03243946159004617\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21957671957671956,\n \"acc_stderr\": 0.021320018599770355,\n \"acc_norm\": 0.21957671957671956,\n \"acc_norm_stderr\": 0.021320018599770355\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1870967741935484,\n \"acc_stderr\": 0.02218571009225226,\n \"acc_norm\": 0.1870967741935484,\n \"acc_norm_stderr\": 0.02218571009225226\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.02798672466673622,\n \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.02798672466673622\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926763,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926763\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1981651376146789,\n \"acc_stderr\": 0.017090573804217885,\n \"acc_norm\": 0.1981651376146789,\n \"acc_norm_stderr\": 0.017090573804217885\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3037974683544304,\n \"acc_stderr\": 0.02993669638713861,\n \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.02993669638713861\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n \"acc_stderr\": 0.03138147637575498,\n \"acc_norm\": 0.32286995515695066,\n \"acc_norm_stderr\": 0.03138147637575498\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591204,\n \"acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591204\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.20245398773006135,\n \"acc_stderr\": 0.031570650789119026,\n \"acc_norm\": 0.20245398773006135,\n \"acc_norm_stderr\": 0.031570650789119026\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3034188034188034,\n \"acc_stderr\": 0.03011821010694267,\n \"acc_norm\": 0.3034188034188034,\n \"acc_norm_stderr\": 0.03011821010694267\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.2567049808429119,\n \"acc_stderr\": 0.015620480263064536,\n \"acc_norm\": 0.2567049808429119,\n \"acc_norm_stderr\": 0.015620480263064536\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n \"acc_stderr\": 0.014593620923210742,\n \"acc_norm\": 0.2558659217877095,\n \"acc_norm_stderr\": 0.014593620923210742\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113596,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113596\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.24382716049382716,\n \"acc_stderr\": 0.023891879541959614,\n \"acc_norm\": 0.24382716049382716,\n \"acc_norm_stderr\": 0.023891879541959614\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872402,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872402\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2522816166883963,\n \"acc_stderr\": 0.011092789056875236,\n \"acc_norm\": 0.2522816166883963,\n \"acc_norm_stderr\": 0.011092789056875236\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.026040662474201264,\n \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.026040662474201264\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.27346938775510204,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.27346938775510204,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n \"acc_stderr\": 0.031524391865554044,\n \"acc_norm\": 0.2736318407960199,\n \"acc_norm_stderr\": 0.031524391865554044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.01525011707915649,\n \"mc2\": 0.3961611921091927,\n \"mc2_stderr\": 0.014575233458509149\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6172059984214681,\n \"acc_stderr\": 0.013660946109442006\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.016679302501895376,\n \"acc_stderr\": 0.003527595888722403\n }\n}\n```", "repo_url": "https://huggingface.co/ericzzz/falcon-rw-1b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|arc:challenge|25_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|gsm8k|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hellaswag|10_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T23-20-03.444693.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T23-20-03.444693.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T23-20-03.444693.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T23-20-03.444693.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T23-20-03.444693.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_07T23_20_03.444693", "path": ["**/details_harness|winogrande|5_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-07T23-20-03.444693.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_07T23_20_03.444693", "path": ["results_2023-12-07T23-20-03.444693.parquet"]}, {"split": "latest", "path": ["results_2023-12-07T23-20-03.444693.parquet"]}]}]} | 2023-12-07T23:22:57+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of ericzzz/falcon-rw-1b-chat
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ericzzz/falcon-rw-1b-chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
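In this plain-text rendering the original code snippet has been stripped out; the sketch below shows the intended call. The repository id is an assumption inferred from the "details_<org>__<model>" naming convention used by these leaderboard detail datasets, and "harness_winogrande_5" is one of the configurations listed in this card's metadata.

```python
from datasets import load_dataset

# Repository id inferred from the "details_<org>__<model>" naming convention (assumption)
data = load_dataset(
    "open-llm-leaderboard/details_ericzzz__falcon-rw-1b-chat",
    "harness_winogrande_5",
    split="latest",
)
```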
## Latest results
These are the latest results from run 2023-12-07T23:20:03.444693 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of ericzzz/falcon-rw-1b-chat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ericzzz/falcon-rw-1b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T23:20:03.444693(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ericzzz/falcon-rw-1b-chat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ericzzz/falcon-rw-1b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T23:20:03.444693(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
6aaef5951bd4ed59dd7a471b37933f84b06d7dcc |
# Dataset Card for Evaluation run of TokenBender/pic_7B_mistral_Full_v0.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TokenBender/pic_7B_mistral_Full_v0.2](https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.2",
"harness_winogrande_5",
	split="latest")
```
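If you are not sure which of the 63 configurations you need, the `datasets` library can enumerate them. This is a minimal sketch using the public `get_dataset_config_names` helper against the repository shown above:

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" configuration
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.2"
)
print(len(configs), configs[:5])
```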
## Latest results
These are the [latest results from run 2023-12-07T23:49:49.992744](https://huggingface.co/datasets/open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.2/blob/main/results_2023-12-07T23-49-49.992744.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6463339087864225,
"acc_stderr": 0.03216750582183759,
"acc_norm": 0.6485521430662642,
"acc_norm_stderr": 0.0328130854300184,
"mc1": 0.42717258261933905,
"mc1_stderr": 0.01731683441096393,
"mc2": 0.5920316586461081,
"mc2_stderr": 0.015007888316808188
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407161,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063228
},
"harness|hellaswag|10": {
"acc": 0.6397132045409281,
"acc_stderr": 0.004791024004588012,
"acc_norm": 0.8402708623780123,
"acc_norm_stderr": 0.0036560593900501087
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033446,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033446
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02934457250063433,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02934457250063433
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374294,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374294
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150877,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265026,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265026
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.016449708209026078,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.016449708209026078
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799804,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799804
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578656,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578656
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139969,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139969
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.028782108105401705,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.028782108105401705
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42717258261933905,
"mc1_stderr": 0.01731683441096393,
"mc2": 0.5920316586461081,
"mc2_stderr": 0.015007888316808188
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462057
},
"harness|gsm8k|5": {
"acc": 0.5974222896133434,
"acc_stderr": 0.013508523063663418
}
}
```
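To work with these aggregated numbers programmatically rather than copying them out of the JSON above, you can load the "results" configuration mentioned earlier. This is a minimal sketch; it assumes the "latest" split naming used by the task configurations in this repository also applies to "results":

```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run (typically one row per run)
results = load_dataset(
    "open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.2",
    "results",
    split="latest",
)
print(results[0])
```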
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.2 | [
"region:us"
]
| 2023-12-07T23:52:41+00:00 | {"pretty_name": "Evaluation run of TokenBender/pic_7B_mistral_Full_v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [TokenBender/pic_7B_mistral_Full_v0.2](https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-07T23:49:49.992744](https://huggingface.co/datasets/open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.2/blob/main/results_2023-12-07T23-49-49.992744.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6463339087864225,\n \"acc_stderr\": 0.03216750582183759,\n \"acc_norm\": 0.6485521430662642,\n \"acc_norm_stderr\": 0.0328130854300184,\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.01731683441096393,\n \"mc2\": 0.5920316586461081,\n \"mc2_stderr\": 0.015007888316808188\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407161,\n \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063228\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6397132045409281,\n \"acc_stderr\": 0.004791024004588012,\n \"acc_norm\": 0.8402708623780123,\n \"acc_norm_stderr\": 0.0036560593900501087\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033446,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033446\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02934457250063433,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02934457250063433\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374294,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374294\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.03021683101150877,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.03021683101150877\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8339719029374202,\n \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265026,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265026\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n \"acc_stderr\": 0.016449708209026078,\n \"acc_norm\": 0.4100558659217877,\n \"acc_norm_stderr\": 0.016449708209026078\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n \"acc_stderr\": 0.012727084826799804,\n \"acc_norm\": 0.4589308996088657,\n \"acc_norm_stderr\": 0.012727084826799804\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578656,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578656\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139969,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139969\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401705,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401705\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.01731683441096393,\n \"mc2\": 0.5920316586461081,\n \"mc2_stderr\": 0.015007888316808188\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462057\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5974222896133434,\n \"acc_stderr\": 0.013508523063663418\n 
}\n}\n```", "repo_url": "https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|arc:challenge|25_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|gsm8k|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hellaswag|10_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T23-49-49.992744.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T23-49-49.992744.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-07T23-49-49.992744.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-07T23-49-49.992744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T23-49-49.992744.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_07T23_49_49.992744", "path": ["**/details_harness|winogrande|5_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-07T23-49-49.992744.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_07T23_49_49.992744", "path": ["results_2023-12-07T23-49-49.992744.parquet"]}, {"split": "latest", "path": ["results_2023-12-07T23-49-49.992744.parquet"]}]}]} | 2023-12-07T23:53:24+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of TokenBender/pic_7B_mistral_Full_v0.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TokenBender/pic_7B_mistral_Full_v0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-07T23:49:49.992744 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find the results of each eval in its own configuration, under the "latest" split):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of TokenBender/pic_7B_mistral_Full_v0.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TokenBender/pic_7B_mistral_Full_v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T23:49:49.992744(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TokenBender/pic_7B_mistral_Full_v0.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TokenBender/pic_7B_mistral_Full_v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-07T23:49:49.992744(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
c102232c14d853e6701e850f90129f02cdcb24d2 |
# Dataset Card for Evaluation run of Delcos/Velara
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Delcos/Velara
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Delcos/Velara](https://huggingface.co/Delcos/Velara) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Delcos__Velara",
"harness_winogrande_5",
split="train")
```
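
As an illustrative sketch (the repository, configuration, and split names below are the ones listed in this card; everything else is just one possible way to inspect the data), you can also load the aggregated "results" configuration, either at its "latest" split or pinned to the timestamped split of a specific run:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Delcos__Velara",
                       "results",
                       split="latest")

# The same data pinned to a specific run via its timestamped split.
pinned = load_dataset("open-llm-leaderboard/details_Delcos__Velara",
                      "results",
                      split="2023_12_08T00_16_45.141900")

# Each row holds the serialized results of one run.
print(results[0].keys())
```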
## Latest results
These are the [latest results from run 2023-12-08T00:16:45.141900](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Velara/blob/main/results_2023-12-08T00-16-45.141900.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results of each eval in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.5941805681088884,
"acc_stderr": 0.03328213036591988,
"acc_norm": 0.5983564094269671,
"acc_norm_stderr": 0.03395331581770101,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024637,
"mc2": 0.44699355725588724,
"mc2_stderr": 0.015255919110214552
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642664
},
"harness|hellaswag|10": {
"acc": 0.6477793268273252,
"acc_stderr": 0.004766860907171539,
"acc_norm": 0.8283210515833499,
"acc_norm_stderr": 0.00376330474609875
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.030052580579557845,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.030052580579557845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357334,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400513,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400513
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.02504919787604234,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.02504919787604234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.763302752293578,
"acc_stderr": 0.0182240781172991,
"acc_norm": 0.763302752293578,
"acc_norm_stderr": 0.0182240781172991
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808517,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371151,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371151
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546665,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.0142426300705749,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.0142426300705749
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388856,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388856
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948853,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948853
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829707,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829707
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.01962744474841223,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.01962744474841223
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024637,
"mc2": 0.44699355725588724,
"mc2_stderr": 0.015255919110214552
},
"harness|winogrande|5": {
"acc": 0.7379636937647988,
"acc_stderr": 0.012358944431637563
},
"harness|gsm8k|5": {
"acc": 0.40333586050037906,
"acc_stderr": 0.013512654781814687
}
}
```
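
As a minimal sketch of how these numbers can be read programmatically, the snippet below reuses the structure of the JSON above (abbreviated here; the variable name `latest_results` is only for illustration) and ranks the per-task normalized accuracies:

```python
# Structure copied (and abbreviated) from the results JSON above.
latest_results = {
    "all": {"acc_norm": 0.5983564094269671},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc_norm": 0.83},
    "harness|hendrycksTest-moral_scenarios|5": {"acc_norm": 0.23798882681564246},
    # ... remaining tasks elided ...
}

# Keep only per-task entries that report a normalized accuracy.
per_task = {
    task: metrics["acc_norm"]
    for task, metrics in latest_results.items()
    if task != "all" and "acc_norm" in metrics
}

# Print tasks from strongest to weakest.
for task, score in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:.3f}  {task}")
```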
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Delcos__Velara | [
"region:us"
]
| 2023-12-08T00:19:36+00:00 | {"pretty_name": "Evaluation run of Delcos/Velara", "dataset_summary": "Dataset automatically created during the evaluation run of model [Delcos/Velara](https://huggingface.co/Delcos/Velara) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Delcos__Velara\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T00:16:45.141900](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Velara/blob/main/results_2023-12-08T00-16-45.141900.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5941805681088884,\n \"acc_stderr\": 0.03328213036591988,\n \"acc_norm\": 0.5983564094269671,\n \"acc_norm_stderr\": 0.03395331581770101,\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024637,\n \"mc2\": 0.44699355725588724,\n \"mc2_stderr\": 0.015255919110214552\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642664\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6477793268273252,\n \"acc_stderr\": 0.004766860907171539,\n \"acc_norm\": 0.8283210515833499,\n \"acc_norm_stderr\": 0.00376330474609875\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03899073687357334,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03899073687357334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 
0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400513,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400513\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5769230769230769,\n \"acc_stderr\": 
0.02504919787604234,\n \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.02504919787604234\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.763302752293578,\n \"acc_stderr\": 0.0182240781172991,\n \"acc_norm\": 0.763302752293578,\n \"acc_norm_stderr\": 0.0182240781172991\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808517,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808517\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n \"acc_stderr\": 0.014805384478371151,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.014805384478371151\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.0142426300705749,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.0142426300705749\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388856,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388856\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.02685882587948853,\n \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.02685882587948853\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829707,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829707\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.01962744474841223,\n \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.01962744474841223\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024637,\n \"mc2\": 0.44699355725588724,\n \"mc2_stderr\": 0.015255919110214552\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7379636937647988,\n \"acc_stderr\": 0.012358944431637563\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40333586050037906,\n \"acc_stderr\": 0.013512654781814687\n }\n}\n```", "repo_url": "https://huggingface.co/Delcos/Velara", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-16-45.141900.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-16-45.141900.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-16-45.141900.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-16-45.141900.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-16-45.141900.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["**/details_harness|winogrande|5_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T00-16-45.141900.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T00_16_45.141900", "path": ["results_2023-12-08T00-16-45.141900.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T00-16-45.141900.parquet"]}]}]} | 2023-12-08T00:20:21+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of Delcos/Velara
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Delcos/Velara on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
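A minimal loading sketch (the repository name below assumes the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming scheme for this run, and uses the `harness_winogrande_5` configuration as an example):

```python
from datasets import load_dataset

# Assumed repository name, following the leaderboard's details-dataset naming convention
data = load_dataset("open-llm-leaderboard/details_Delcos__Velara",
                    "harness_winogrande_5",
                    split="train")
```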
## Latest results
These are the latest results from run 2023-12-08T00:16:45.141900 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Delcos/Velara",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Delcos/Velara on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:16:45.141900(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Delcos/Velara",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Delcos/Velara on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:16:45.141900(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
51596a80229bd5f39683270740508f299c8209ba |
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
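# Each evaluated task has its own configuration; "harness_winogrande_5" is one example.
# The "train" split points to the latest results for that task.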
data = load_dataset("open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-441step-flan-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-08T00:30:10.216270](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-441step-flan-v2/blob/main/results_2023-12-08T00-30-10.216270.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4859119672157952,
"acc_stderr": 0.03429323694683427,
"acc_norm": 0.49074853348450337,
"acc_norm_stderr": 0.03505156425981663,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.4502487926207983,
"mc2_stderr": 0.01555198529323624
},
"harness|arc:challenge|25": {
"acc": 0.4906143344709898,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5213310580204779,
"acc_norm_stderr": 0.014598087973127106
},
"harness|hellaswag|10": {
"acc": 0.5900219079864569,
"acc_stderr": 0.004908241354310213,
"acc_norm": 0.7763393746265684,
"acc_norm_stderr": 0.004158455808204937
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.02345603738398203,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.02345603738398203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238126,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5290322580645161,
"acc_stderr": 0.028396016402761,
"acc_norm": 0.5290322580645161,
"acc_norm_stderr": 0.028396016402761
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998573,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998573
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.03488901616852732,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.03488901616852732
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041153,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4256410256410256,
"acc_stderr": 0.02506909438729654,
"acc_norm": 0.4256410256410256,
"acc_norm_stderr": 0.02506909438729654
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.671559633027523,
"acc_stderr": 0.02013590279729841,
"acc_norm": 0.671559633027523,
"acc_norm_stderr": 0.02013590279729841
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033086111132364336,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033086111132364336
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.03087453753755362,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.03087453753755362
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.04356447202665069,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.04356447202665069
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138937,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138937
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6794380587484036,
"acc_stderr": 0.016688893310803764,
"acc_norm": 0.6794380587484036,
"acc_norm_stderr": 0.016688893310803764
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.02691189868637793,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.02691189868637793
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23016759776536314,
"acc_stderr": 0.014078339253425817,
"acc_norm": 0.23016759776536314,
"acc_norm_stderr": 0.014078339253425817
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.028099240775809553,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.028099240775809553
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.0275860062216077,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.0275860062216077
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.02878222756134724,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.02878222756134724
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3500651890482399,
"acc_stderr": 0.012182552313215174,
"acc_norm": 0.3500651890482399,
"acc_norm_stderr": 0.012182552313215174
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.020226106567657807,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.020226106567657807
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5142857142857142,
"acc_stderr": 0.03199615232806287,
"acc_norm": 0.5142857142857142,
"acc_norm_stderr": 0.03199615232806287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.4502487926207983,
"mc2_stderr": 0.01555198529323624
},
"harness|winogrande|5": {
"acc": 0.7253354380426204,
"acc_stderr": 0.012544516005117187
},
"harness|gsm8k|5": {
"acc": 0.17816527672479152,
"acc_stderr": 0.01054013252754947
}
}
```
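To work with these numbers programmatically instead of reading the JSON above, one option (a minimal sketch, assuming the aggregated "results" configuration and the "latest" split described in the summary) is:

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split points at the most recent evaluation
results = load_dataset(
    "open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-441step-flan-v2",
    "results",
    split="latest",
)
print(results[0])  # single row holding the aggregated results shown above
```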
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-441step-flan-v2 | [
"region:us"
]
| 2023-12-08T00:33:08+00:00 | {"pretty_name": "Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-441step-flan-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T00:30:10.216270](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-441step-flan-v2/blob/main/results_2023-12-08T00-30-10.216270.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4859119672157952,\n \"acc_stderr\": 0.03429323694683427,\n \"acc_norm\": 0.49074853348450337,\n \"acc_norm_stderr\": 0.03505156425981663,\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.4502487926207983,\n \"mc2_stderr\": 0.01555198529323624\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4906143344709898,\n \"acc_stderr\": 0.014608816322065,\n \"acc_norm\": 0.5213310580204779,\n \"acc_norm_stderr\": 0.014598087973127106\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5900219079864569,\n \"acc_stderr\": 0.004908241354310213,\n \"acc_norm\": 0.7763393746265684,\n \"acc_norm_stderr\": 0.004158455808204937\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389184,\n \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389184\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n \"acc_norm_stderr\": 
0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.02345603738398203,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.02345603738398203\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.038095238095238126,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.038095238095238126\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n \"acc_stderr\": 0.028396016402761,\n \"acc_norm\": 0.5290322580645161,\n \"acc_norm_stderr\": 0.028396016402761\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998573,\n \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998573\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.601010101010101,\n \"acc_stderr\": 0.03488901616852732,\n \"acc_norm\": 0.601010101010101,\n \"acc_norm_stderr\": 0.03488901616852732\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n \"acc_norm\": 
0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041153\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4256410256410256,\n \"acc_stderr\": 0.02506909438729654,\n \"acc_norm\": 0.4256410256410256,\n \"acc_norm_stderr\": 0.02506909438729654\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.671559633027523,\n \"acc_stderr\": 0.02013590279729841,\n \"acc_norm\": 0.671559633027523,\n \"acc_norm_stderr\": 0.02013590279729841\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3287037037037037,\n \"acc_stderr\": 0.032036140846700596,\n \"acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.032036140846700596\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.033086111132364336,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033086111132364336\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6582278481012658,\n \"acc_stderr\": 0.03087453753755362,\n \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.03087453753755362\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138937,\n \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138937\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.050161355804659205\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6794380587484036,\n \"acc_stderr\": 0.016688893310803764,\n \"acc_norm\": 0.6794380587484036,\n \"acc_norm_stderr\": 0.016688893310803764\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.02691189868637793,\n \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.02691189868637793\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23016759776536314,\n \"acc_stderr\": 0.014078339253425817,\n \"acc_norm\": 0.23016759776536314,\n \"acc_norm_stderr\": 0.014078339253425817\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n \"acc_stderr\": 0.028099240775809553,\n \"acc_norm\": 0.572347266881029,\n \"acc_norm_stderr\": 0.028099240775809553\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.0275860062216077,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.0275860062216077\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36879432624113473,\n \"acc_stderr\": 0.02878222756134724,\n \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.02878222756134724\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3500651890482399,\n \"acc_stderr\": 0.012182552313215174,\n \"acc_norm\": 0.3500651890482399,\n \"acc_norm_stderr\": 0.012182552313215174\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.03033257809455504,\n \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.03033257809455504\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.020226106567657807,\n \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.020226106567657807\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5142857142857142,\n \"acc_stderr\": 0.03199615232806287,\n \"acc_norm\": 0.5142857142857142,\n \"acc_norm_stderr\": 0.03199615232806287\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.034240429246915824,\n \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.034240429246915824\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.4502487926207983,\n \"mc2_stderr\": 0.01555198529323624\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7253354380426204,\n \"acc_stderr\": 0.012544516005117187\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.17816527672479152,\n \"acc_stderr\": 0.01054013252754947\n }\n}\n```", "repo_url": "https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-30-10.216270.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-30-10.216270.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-30-10.216270.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-30-10.216270.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-30-10.216270.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["**/details_harness|winogrande|5_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-08T00-30-10.216270.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T00_30_10.216270", "path": ["results_2023-12-08T00-30-10.216270.parquet"]}, {"split": "latest", "path": ["results_2023-12-08T00-30-10.216270.parquet"]}]}]} | 2023-12-08T00:33:52+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
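A minimal sketch of that load call (the dataset id below is an assumption, inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention rather than stated in this card):

```python
from datasets import load_dataset

# Assumed repository id, following the details_<org>__<model> convention
data = load_dataset(
    "open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-441step-flan-v2",
    "harness_winogrande_5",  # any of the 63 task configurations can be used here
    split="train",           # "train" always points to the latest results
)
```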
## Latest results
These are the latest results from run 2023-12-08T00:30:10.216270 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:30:10.216270(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-441step-flan-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:30:10.216270(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
50b299277cc44f36ae537ed202b4f071d13aa63f |
# Dataset Card for Evaluation run of HyperbeeAI/Tulpar-7b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HyperbeeAI/Tulpar-7b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [HyperbeeAI/Tulpar-7b-v2](https://huggingface.co/HyperbeeAI/Tulpar-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v2",
"harness_winogrande_5",
split="train")
```
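
Once loaded, `data` is a regular `datasets.Dataset` holding the per-example details for the chosen task. A quick way to see what came back (a sketch, reusing the `data` object loaded above; the exact column names depend on the harness task):

```python
print(data)               # number of rows and the column names for this task
print(data.column_names)  # per-example fields recorded by the evaluation harness
print(data[0])            # inspect the first evaluated example
```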
## Latest results
These are the [latest results from run 2023-12-08T00:34:52.885019](https://huggingface.co/datasets/open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v2/blob/main/results_2023-12-08T00-34-52.885019.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6339643856948549,
"acc_stderr": 0.032544251757185845,
"acc_norm": 0.6346927587185602,
"acc_norm_stderr": 0.03320666732717101,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.6364510557327406,
"mc2_stderr": 0.015316470691061521
},
"harness|arc:challenge|25": {
"acc": 0.6450511945392492,
"acc_stderr": 0.013983036904094095,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.6728739294961164,
"acc_stderr": 0.0046820489066223174,
"acc_norm": 0.8489344752041426,
"acc_norm_stderr": 0.0035738085511685283
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601684,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601684
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590172,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590172
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676187,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676187
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705049,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705049
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165623,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165623
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.01362555690799345,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.01362555690799345
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.016421670506339185,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.016421670506339185
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729147,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032212,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032212
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578655,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.6364510557327406,
"mc2_stderr": 0.015316470691061521
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.01135031570746207
},
"harness|gsm8k|5": {
"acc": 0.6360879454131918,
"acc_stderr": 0.013252539227966185
}
}
```
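
The same aggregated numbers are stored in the "results" configuration, so they can be pulled programmatically instead of copied from the JSON above. A sketch, assuming the configuration and split names follow the conventions described in the summary ("results" config, "latest" split pointing to the newest run):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" mirrors the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v2",
    "results",
    split="latest",
)
print(results[0])  # aggregated accuracies per benchmark for this run
```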
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v2 | [
"region:us"
]
| 2023-12-08T00:37:44+00:00 | {"pretty_name": "Evaluation run of HyperbeeAI/Tulpar-7b-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [HyperbeeAI/Tulpar-7b-v2](https://huggingface.co/HyperbeeAI/Tulpar-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T00:34:52.885019](https://huggingface.co/datasets/open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v2/blob/main/results_2023-12-08T00-34-52.885019.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6339643856948549,\n \"acc_stderr\": 0.032544251757185845,\n \"acc_norm\": 0.6346927587185602,\n \"acc_norm_stderr\": 0.03320666732717101,\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.6364510557327406,\n \"mc2_stderr\": 0.015316470691061521\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6450511945392492,\n \"acc_stderr\": 0.013983036904094095,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6728739294961164,\n \"acc_stderr\": 0.0046820489066223174,\n \"acc_norm\": 0.8489344752041426,\n \"acc_norm_stderr\": 0.0035738085511685283\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n 
\"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342853,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342853\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676187,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676187\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705049,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705049\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165623,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165623\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.01362555690799345,\n 
\"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.01362555690799345\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n \"acc_stderr\": 0.016421670506339185,\n \"acc_norm\": 0.40558659217877097,\n \"acc_norm_stderr\": 0.016421670506339185\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729147,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729147\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.012719949543032212,\n \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.012719949543032212\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.6364510557327406,\n \"mc2_stderr\": 0.015316470691061521\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.01135031570746207\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6360879454131918,\n \"acc_stderr\": 0.013252539227966185\n }\n}\n```", "repo_url": 
"https://huggingface.co/HyperbeeAI/Tulpar-7b-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-34-52.885019.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-34-52.885019.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-34-52.885019.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-34-52.885019.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-34-52.885019.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-34-52.885019.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["**/details_harness|winogrande|5_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T00-34-52.885019.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T00_34_52.885019", "path": ["results_2023-12-08T00-34-52.885019.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T00-34-52.885019.parquet"]}]}]} | 2023-12-08T00:38:32+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of HyperbeeAI/Tulpar-7b-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model HyperbeeAI/Tulpar-7b-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
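A minimal sketch of such a call (mirroring the loading snippet recorded in this repository's metadata; the `datasets` library is assumed to be installed):

```python
from datasets import load_dataset

# Each evaluated task is its own configuration; the "train" split
# always points to the latest run of this model.
data = load_dataset(
    "open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v2",
    "harness_winogrande_5",
    split="train",
)
```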
## Latest results
These are the latest results from run 2023-12-08T00:34:52.885019 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of HyperbeeAI/Tulpar-7b-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model HyperbeeAI/Tulpar-7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:34:52.885019(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of HyperbeeAI/Tulpar-7b-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model HyperbeeAI/Tulpar-7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:34:52.885019(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
dcd98d17e5ebc667417e11ef5daf7e7467640fe8 |
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.01
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/perlthoughts/Chupacabra-7B-v2.01
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2.01](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.01",
"harness_winogrande_5",
split="train")
```
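The aggregated scores mentioned above live in the "results" configuration; a small illustrative sketch (the exact configuration and split names follow the same pattern as above, so treat them as an assumption):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of every run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.01",
    "results",
    split="latest",
)
print(results[0])
```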
## Latest results
These are the [latest results from run 2023-12-09T15:32:56.516225](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.01/blob/main/results_2023-12-09T15-32-56.516225.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6419099817681183,
"acc_stderr": 0.03236594172955846,
"acc_norm": 0.6436312675927691,
"acc_norm_stderr": 0.033018487416734604,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.6350167676730681,
"mc2_stderr": 0.015081304881172946
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145685,
"acc_norm": 0.6885665529010239,
"acc_norm_stderr": 0.013532472099850942
},
"harness|hellaswag|10": {
"acc": 0.6758613821947819,
"acc_stderr": 0.004670955399641123,
"acc_norm": 0.8611830312686716,
"acc_norm_stderr": 0.003450488042965006
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.023661296393964273,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.023661296393964273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546837,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546837
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.01640712303219525,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.01640712303219525
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046623,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046623
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669968,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.6350167676730681,
"mc2_stderr": 0.015081304881172946
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938282
},
"harness|gsm8k|5": {
"acc": 0.5966641394996209,
"acc_stderr": 0.013512654781814706
}
}
```
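The snapshot above is also stored in the `results` config of this repository (see the config list in the dataset metadata). Below is a minimal sketch for pulling the aggregated metrics programmatically; it assumes the `results` config exposes a `latest` split like the per-task configs do, otherwise substitute one of the timestamped split names.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.01"

# Assumption: the "results" config follows the same split convention as the
# per-task configs, i.e. timestamped run splits plus a "latest" alias.
results = load_dataset(REPO, "results", split="latest")

# Each row holds the aggregated metrics of one evaluation run.
print(results[0])
```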
### Supported Tasks and Leaderboards
[More Information Needed]
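The evaluated tasks can be read off the config list in the dataset metadata: ARC-Challenge (25-shot), HellaSwag (10-shot), the MMLU/hendrycksTest subjects (5-shot), TruthfulQA-MC (0-shot), Winogrande (5-shot) and GSM8K (5-shot), each exposed as one or more configs. A small sketch for listing the available configs, assuming Hub access from your environment:

```python
from datasets import get_dataset_config_names

# Lists every per-task config (harness_arc_challenge_25, harness_gsm8k_5, ...).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.01"
)
print(len(configs))
print(configs[:10])
```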
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
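As visible in the dataset metadata, each config carries one split per evaluation run, named after the run timestamp (e.g. `2023_12_08T00_37_57.144629`, `2023_12_09T15_32_56.516225`), plus a `latest` split pointing at the most recent run. A minimal sketch, using the `harness_gsm8k_5` config as an example:

```python
from datasets import get_dataset_split_names, load_dataset

REPO = "open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.01"

# Expect the timestamped run splits plus "latest".
print(get_dataset_split_names(REPO, "harness_gsm8k_5"))

# "latest" resolves to the most recent run's per-example details.
gsm8k = load_dataset(REPO, "harness_gsm8k_5", split="latest")
print(gsm8k.num_rows)
```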
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.01 | [
"region:us"
]
| 2023-12-08T00:40:48+00:00 | {"pretty_name": "Evaluation run of perlthoughts/Chupacabra-7B-v2.01", "dataset_summary": "Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2.01](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.01\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T15:32:56.516225](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.01/blob/main/results_2023-12-09T15-32-56.516225.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6419099817681183,\n \"acc_stderr\": 0.03236594172955846,\n \"acc_norm\": 0.6436312675927691,\n \"acc_norm_stderr\": 0.033018487416734604,\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.6350167676730681,\n \"mc2_stderr\": 0.015081304881172946\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145685,\n \"acc_norm\": 0.6885665529010239,\n \"acc_norm_stderr\": 0.013532472099850942\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6758613821947819,\n \"acc_stderr\": 0.004670955399641123,\n \"acc_norm\": 0.8611830312686716,\n \"acc_norm_stderr\": 0.003450488042965006\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.023661296393964273,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.023661296393964273\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.822477650063857,\n \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546837,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546837\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n \"acc_stderr\": 0.01640712303219525,\n \"acc_norm\": 0.4033519553072626,\n \"acc_norm_stderr\": 0.01640712303219525\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046623,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046623\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n \"acc_stderr\": 0.012728446067669968,\n \"acc_norm\": 0.4595827900912647,\n \"acc_norm_stderr\": 0.012728446067669968\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.6350167676730681,\n \"mc2_stderr\": 0.015081304881172946\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938282\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5966641394996209,\n \"acc_stderr\": 0.013512654781814706\n 
}\n}\n```", "repo_url": "https://huggingface.co/perlthoughts/Chupacabra-7B-v2.01", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-37-57.144629.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-37-57.144629.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-32-56.516225.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-32-56.516225.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-32-56.516225.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T15-32-56.516225.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-37-57.144629.parquet"]}, 
{"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["**/details_harness|winogrande|5_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": ["**/details_harness|winogrande|5_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T15-32-56.516225.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T00_37_57.144629", "path": ["results_2023-12-08T00-37-57.144629.parquet"]}, {"split": "2023_12_09T15_32_56.516225", "path": 
["results_2023-12-09T15-32-56.516225.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T15-32-56.516225.parquet"]}]}]} | 2023-12-09T15:35:53+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.01
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2.01 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-09T15:32:56.516225 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.01",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2.01 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-09T15:32:56.516225(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.01",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2.01 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-09T15:32:56.516225(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
6e769790b3ce5f5bbb07145d86effe08650a4534 |
# Dataset Card for Evaluation run of Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B](https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-7b-v3-2-7B",
"harness_winogrande_5",
split="train")
```
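
Note that the splits actually declared for each configuration in this repository are the run timestamp and `latest` (see the configs listed further down). A minimal variant that pins the same task to the most recent run (the `latest` split name is taken from those declared configs, not from the snippet above):

```python
from datasets import load_dataset

# Same task configuration as above, but pinned to the most recent
# evaluation run via the "latest" split declared in this repository's configs.
latest_winogrande = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-7b-v3-2-7B",
    "harness_winogrande_5",
    split="latest",
)
print(latest_winogrande)
```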
## Latest results
These are the [latest results from run 2023-12-08T00:44:42.656336](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-7b-v3-2-7B/blob/main/results_2023-12-08T00-44-42.656336.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6306409863079997,
"acc_stderr": 0.03241391816831307,
"acc_norm": 0.6325971671659117,
"acc_norm_stderr": 0.033063792745877105,
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6358996422025046,
"mc2_stderr": 0.015244549025156413
},
"harness|arc:challenge|25": {
"acc": 0.6416382252559727,
"acc_stderr": 0.01401288333485986,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205763
},
"harness|hellaswag|10": {
"acc": 0.6559450308703445,
"acc_stderr": 0.004740882120999966,
"acc_norm": 0.8410675164309899,
"acc_norm_stderr": 0.0036486590414936473
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659356,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659356
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974333,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974333
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501555,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501555
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676187,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676187
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460364,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460364
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001512,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001512
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.0163372686942701,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.0163372686942701
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115327,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115327
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294674,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358981,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983576,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553704,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623326,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6358996422025046,
"mc2_stderr": 0.015244549025156413
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345396
},
"harness|gsm8k|5": {
"acc": 0.5678544351781653,
"acc_stderr": 0.013645072137842447
}
}
```
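
The aggregated numbers above can also be pulled programmatically. A minimal sketch using the `results` configuration and its `latest` split (both taken from this repository's declared configs; the exact column layout of the stored table is not reproduced here, so inspect the returned dataset):

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run.
# "results" and its "latest" split are declared in this repository's configs.
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-7b-v3-2-7B",
    "results",
    split="latest",
)
print(results)
```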
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-7b-v3-2-7B | [
"region:us"
]
| 2023-12-08T00:47:34+00:00 | {"pretty_name": "Evaluation run of Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B](https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-7b-v3-2-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T00:44:42.656336](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-7b-v3-2-7B/blob/main/results_2023-12-08T00-44-42.656336.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6306409863079997,\n \"acc_stderr\": 0.03241391816831307,\n \"acc_norm\": 0.6325971671659117,\n \"acc_norm_stderr\": 0.033063792745877105,\n \"mc1\": 0.4614443084455324,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6358996422025046,\n \"mc2_stderr\": 0.015244549025156413\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6416382252559727,\n \"acc_stderr\": 0.01401288333485986,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205763\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6559450308703445,\n \"acc_stderr\": 0.004740882120999966,\n \"acc_norm\": 0.8410675164309899,\n \"acc_norm_stderr\": 0.0036486590414936473\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.032500536843658404,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.032500536843658404\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659356,\n \"acc_norm\": 0.8756476683937824,\n 
\"acc_norm_stderr\": 0.02381447708659356\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974333,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974333\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501555,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501555\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676187,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676187\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460364,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460364\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8135376756066411,\n \"acc_stderr\": 0.013927751372001512,\n \"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001512\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n \"acc_stderr\": 0.0163372686942701,\n \"acc_norm\": 0.39329608938547483,\n \"acc_norm_stderr\": 0.0163372686942701\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115327,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115327\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294674,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294674\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983576,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983576\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553704,\n \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553704\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623326,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4614443084455324,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6358996422025046,\n \"mc2_stderr\": 0.015244549025156413\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345396\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5678544351781653,\n \"acc_stderr\": 
0.013645072137842447\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-44-42.656336.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-44-42.656336.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-44-42.656336.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-44-42.656336.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-44-42.656336.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T00_44_42.656336", "path": ["**/details_harness|winogrande|5_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T00-44-42.656336.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_08T00_44_42.656336", "path": ["results_2023-12-08T00-44-42.656336.parquet"]}, {"split": "latest", "path": ["results_2023-12-08T00-44-42.656336.parquet"]}]}]} | 2023-12-08T00:48:21+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-08T00:44:42.656336 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:44:42.656336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:44:42.656336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
ebc51cf086fb3e411cc8d10e2f43903919c4081c |
# Dataset Card for Evaluation run of TheBloke/Poro-34B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Poro-34B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Poro-34B-GPTQ](https://huggingface.co/TheBloke/Poro-34B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Poro-34B-GPTQ",
"harness_winogrande_5",
split="train")
```
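
The aggregated metrics described above live in the "results" configuration of the same repository. The snippet below is a minimal sketch (assuming the "latest" split naming used throughout this card) for pulling those run-level scores instead of the per-example details:

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated scores of the run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_TheBloke__Poro-34B-GPTQ",
	"results",
	split="latest")

# Typically a single row per run; inspect its columns to see the metrics.
print(results.column_names)
print(results[0])
```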
## Latest results
These are the [latest results from run 2023-12-08T00:50:01.441893](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Poro-34B-GPTQ/blob/main/results_2023-12-08T00-50-01.441893.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.33180686611984367,
"acc_stderr": 0.033120719430084274,
"acc_norm": 0.334154536073911,
"acc_norm_stderr": 0.03391850157363058,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041843,
"mc2": 0.383670325628089,
"mc2_stderr": 0.01409469586210006
},
"harness|arc:challenge|25": {
"acc": 0.4206484641638225,
"acc_stderr": 0.014426211252508406,
"acc_norm": 0.47013651877133106,
"acc_norm_stderr": 0.014585305840007102
},
"harness|hellaswag|10": {
"acc": 0.5477992431786497,
"acc_stderr": 0.004966928094797573,
"acc_norm": 0.7375024895439155,
"acc_norm_stderr": 0.004390923353200555
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3471698113207547,
"acc_stderr": 0.029300101705549655,
"acc_norm": 0.3471698113207547,
"acc_norm_stderr": 0.029300101705549655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.038009680605548574,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.038009680605548574
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336937,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336937
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.35172413793103446,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.35172413793103446,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782426,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.031447125816782426
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.37575757575757573,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.37575757575757573,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03191178226713548,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03191178226713548
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29831932773109243,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.29831932773109243,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3247706422018349,
"acc_stderr": 0.020077729109310327,
"acc_norm": 0.3247706422018349,
"acc_norm_stderr": 0.020077729109310327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3480392156862745,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.3480392156862745,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.38396624472573837,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.38396624472573837,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4304932735426009,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.4304932735426009,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462202,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462202
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4132231404958678,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.4132231404958678,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.31901840490797545,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.31901840490797545,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.04689765937278132,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.04689765937278132
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.42735042735042733,
"acc_stderr": 0.032408473935163266,
"acc_norm": 0.42735042735042733,
"acc_norm_stderr": 0.032408473935163266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4099616858237548,
"acc_stderr": 0.01758767231233604,
"acc_norm": 0.4099616858237548,
"acc_norm_stderr": 0.01758767231233604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.025305258131879706,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.025305258131879706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3562091503267974,
"acc_stderr": 0.027420477662629242,
"acc_norm": 0.3562091503267974,
"acc_norm_stderr": 0.027420477662629242
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3504823151125402,
"acc_stderr": 0.027098652621301744,
"acc_norm": 0.3504823151125402,
"acc_norm_stderr": 0.027098652621301744
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.027002521034516475,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.027002521034516475
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2985658409387223,
"acc_stderr": 0.011688060141794217,
"acc_norm": 0.2985658409387223,
"acc_norm_stderr": 0.011688060141794217
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280058,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280058
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.02904308868330435,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.02904308868330435
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.35323383084577115,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.35323383084577115,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4269005847953216,
"acc_stderr": 0.03793620616529917,
"acc_norm": 0.4269005847953216,
"acc_norm_stderr": 0.03793620616529917
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041843,
"mc2": 0.383670325628089,
"mc2_stderr": 0.01409469586210006
},
"harness|winogrande|5": {
"acc": 0.7134964483030781,
"acc_stderr": 0.012707030139960381
},
"harness|gsm8k|5": {
"acc": 0.05079605761940864,
"acc_stderr": 0.006048352096878092
}
}
```
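
If you would rather work with the raw results file linked above than with the parquet configurations, a small sketch using `huggingface_hub` is shown below; the filename is the one from this run (adjust it for later runs), and the exact top-level layout of the JSON is an assumption you should verify by inspecting its keys:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__Poro-34B-GPTQ",
    filename="results_2023-12-08T00-50-01.441893.json",
    repo_type="dataset",
)

with open(path) as f:
    run = json.load(f)

# The per-task metrics mirror the block shown above; inspect the keys
# first, since the file may nest them under a "results" entry.
print(list(run.keys()))
```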
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_TheBloke__Poro-34B-GPTQ | [
"region:us"
]
| 2023-12-08T00:52:38+00:00 | {"pretty_name": "Evaluation run of TheBloke/Poro-34B-GPTQ", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Poro-34B-GPTQ](https://huggingface.co/TheBloke/Poro-34B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Poro-34B-GPTQ\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T00:50:01.441893](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Poro-34B-GPTQ/blob/main/results_2023-12-08T00-50-01.441893.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.33180686611984367,\n \"acc_stderr\": 0.033120719430084274,\n \"acc_norm\": 0.334154536073911,\n \"acc_norm_stderr\": 0.03391850157363058,\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.383670325628089,\n \"mc2_stderr\": 0.01409469586210006\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4206484641638225,\n \"acc_stderr\": 0.014426211252508406,\n \"acc_norm\": 0.47013651877133106,\n \"acc_norm_stderr\": 0.014585305840007102\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5477992431786497,\n \"acc_stderr\": 0.004966928094797573,\n \"acc_norm\": 0.7375024895439155,\n \"acc_norm_stderr\": 0.004390923353200555\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3471698113207547,\n \"acc_stderr\": 0.029300101705549655,\n \"acc_norm\": 0.3471698113207547,\n \"acc_norm_stderr\": 0.029300101705549655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.038009680605548574,\n \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.038009680605548574\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n 
\"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.3179190751445087,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880557,\n \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880557\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336937,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336937\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.35172413793103446,\n \"acc_stderr\": 0.0397923663749741,\n \"acc_norm\": 0.35172413793103446,\n \"acc_norm_stderr\": 0.0397923663749741\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782426,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782426\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.37575757575757573,\n \"acc_stderr\": 0.03781887353205982,\n \"acc_norm\": 0.37575757575757573,\n \"acc_norm_stderr\": 0.03781887353205982\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.03191178226713548,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03191178226713548\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.034801756684660366,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.034801756684660366\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.29831932773109243,\n \"acc_stderr\": 0.029719142876342853,\n \"acc_norm\": 0.29831932773109243,\n \"acc_norm_stderr\": 0.029719142876342853\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3247706422018349,\n \"acc_stderr\": 0.020077729109310327,\n \"acc_norm\": 0.3247706422018349,\n \"acc_norm_stderr\": 0.020077729109310327\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3480392156862745,\n \"acc_stderr\": 0.03343311240488419,\n \"acc_norm\": 0.3480392156862745,\n \"acc_norm_stderr\": 0.03343311240488419\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.38396624472573837,\n \"acc_stderr\": 0.031658678064106674,\n \"acc_norm\": 0.38396624472573837,\n \"acc_norm_stderr\": 0.031658678064106674\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462202,\n \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462202\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.31901840490797545,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.31901840490797545,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.04689765937278132,\n \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.04689765937278132\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.42735042735042733,\n \"acc_stderr\": 0.032408473935163266,\n \"acc_norm\": 0.42735042735042733,\n \"acc_norm_stderr\": 0.032408473935163266\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4099616858237548,\n \"acc_stderr\": 0.01758767231233604,\n \"acc_norm\": 0.4099616858237548,\n 
\"acc_norm_stderr\": 0.01758767231233604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.025305258131879706,\n \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.025305258131879706\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3562091503267974,\n \"acc_stderr\": 0.027420477662629242,\n \"acc_norm\": 0.3562091503267974,\n \"acc_norm_stderr\": 0.027420477662629242\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3504823151125402,\n \"acc_stderr\": 0.027098652621301744,\n \"acc_norm\": 0.3504823151125402,\n \"acc_norm_stderr\": 0.027098652621301744\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.37962962962962965,\n \"acc_stderr\": 0.027002521034516475,\n \"acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.027002521034516475\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2985658409387223,\n \"acc_stderr\": 0.011688060141794217,\n \"acc_norm\": 0.2985658409387223,\n \"acc_norm_stderr\": 0.011688060141794217\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280058,\n \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280058\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.01877168389352818,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.01877168389352818\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.4090909090909091,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.02904308868330435,\n \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.02904308868330435\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.35323383084577115,\n \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.35323383084577115,\n \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4269005847953216,\n \"acc_stderr\": 0.03793620616529917,\n \"acc_norm\": 0.4269005847953216,\n \"acc_norm_stderr\": 0.03793620616529917\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.383670325628089,\n \"mc2_stderr\": 0.01409469586210006\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7134964483030781,\n \"acc_stderr\": 0.012707030139960381\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05079605761940864,\n \"acc_stderr\": 0.006048352096878092\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Poro-34B-GPTQ", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-50-01.441893.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-50-01.441893.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-50-01.441893.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T00-50-01.441893.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-50-01.441893.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-50-01.441893.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["**/details_harness|winogrande|5_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T00-50-01.441893.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T00_50_01.441893", "path": ["results_2023-12-08T00-50-01.441893.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T00-50-01.441893.parquet"]}]}]} | 2023-12-08T00:53:25+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Poro-34B-GPTQ
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Poro-34B-GPTQ on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-08T00:50:01.441893 (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of TheBloke/Poro-34B-GPTQ",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Poro-34B-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:50:01.441893(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Poro-34B-GPTQ",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Poro-34B-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T00:50:01.441893(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
170a6537832ff7ba8f3488f25bda108418930b12 | # Dataset Card for "contextual-new-ontology-two-tier-v11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | magnifi/contextual-new-ontology-two-tier-v11 | [
"region:us"
]
| 2023-12-08T00:53:25+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "user_text", "dtype": "string"}, {"name": "contextual", "dtype": "bool"}, {"name": "true_intent", "dtype": "string"}, {"name": "completion", "dtype": "string"}, {"name": "chat_history", "dtype": "string"}, {"name": "message_id", "dtype": "string"}, {"name": "chat_history_message_ids", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1022198, "num_examples": 5781}, {"name": "validation", "num_bytes": 332831, "num_examples": 1968}], "download_size": 239760, "dataset_size": 1355029}} | 2023-12-08T00:53:26+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "contextual-new-ontology-two-tier-v11"
More Information needed | [
"# Dataset Card for \"contextual-new-ontology-two-tier-v11\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"contextual-new-ontology-two-tier-v11\"\n\nMore Information needed"
]
|
e2554ca81ce75d156a3073ad4087ed3136cd38e0 | # Dataset Card for "fm-updates-falcon-7b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | coastalcph/fm-updates-falcon-7b | [
"region:us"
]
| 2023-12-08T00:54:43+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "query", "struct": [{"name": "label", "dtype": "string"}, {"name": "objects", "list": [{"name": "aliases", "sequence": "string"}, {"name": "label", "dtype": "string"}, {"name": "qid", "dtype": "string"}]}, {"name": "qid", "dtype": "string"}, {"name": "rel_id", "dtype": "string"}, {"name": "relation", "dtype": "string"}]}, {"name": "prediction", "struct": [{"name": "predictions", "list": [{"name": "answer", "dtype": "string"}, {"name": "first_token_probability", "dtype": "float64"}, {"name": "per_token_probability", "sequence": "float64"}, {"name": "perplexity", "dtype": "float64"}]}, {"name": "query", "dtype": "string"}]}, {"name": "f1", "dtype": "float64"}, {"name": "relation", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "original_answer", "dtype": "string"}, {"name": "updates", "sequence": "string"}], "splits": [{"name": "test", "num_bytes": 386289.64365440264, "num_examples": 408}], "download_size": 292243, "dataset_size": 386289.64365440264}} | 2023-12-08T00:54:46+00:00 | []
| []
| TAGS
#region-us
| # Dataset Card for "fm-updates-falcon-7b"
More Information needed | [
"# Dataset Card for \"fm-updates-falcon-7b\"\n\nMore Information needed"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for \"fm-updates-falcon-7b\"\n\nMore Information needed"
]
|
f93b7a0c894e6e02edb7f0ae17d02664c709c245 |
# Dataset Card for Evaluation run of TokenBender/pic_7B_mistral_Full_v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TokenBender/pic_7B_mistral_Full_v0.1](https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1",
"harness_winogrande_5",
split="train")
```
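The dataset exposes one configuration per evaluated task plus the aggregated "results" configuration listed in the metadata. If you want to discover the available configurations programmatically before loading one, a minimal sketch using the standard `datasets` helpers could look like the following (the exact list printed is only illustrative):

```python
from datasets import get_dataset_config_names, load_dataset

# Discover every configuration (one per evaluated task, plus "results")
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1"
)
print(configs)

# Load the aggregated results at their most recent timestamp
results = load_dataset(
    "open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1",
    "results",
    split="latest",
)
```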
## Latest results
These are the [latest results from run 2023-12-08T01:00:48.190749](https://huggingface.co/datasets/open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1/blob/main/results_2023-12-08T01-00-48.190749.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6335554932812488,
"acc_stderr": 0.03234608898724019,
"acc_norm": 0.6365587293846601,
"acc_norm_stderr": 0.03299054248415427,
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142978,
"mc2": 0.5451115248499341,
"mc2_stderr": 0.015141183727073078
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349812,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6406094403505278,
"acc_stderr": 0.004788412062375695,
"acc_norm": 0.8369846644094802,
"acc_norm_stderr": 0.0036862475593618534
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155236,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155236
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.0291265228345868,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.0291265228345868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834838,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834838
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647886,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647886
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.01596103667523096,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.01596103667523096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868062,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868062
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358983,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142978,
"mc2": 0.5451115248499341,
"mc2_stderr": 0.015141183727073078
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643416
},
"harness|gsm8k|5": {
"acc": 0.5269143290371494,
"acc_stderr": 0.013752517189717447
}
}
```
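If you would rather rank the per-task scores shown above than read them by eye, one option is to post-process the JSON directly. The snippet below is a small sketch that assumes the block above has been saved to a local file named `results.json` (a hypothetical filename); it sorts the tasks that report a normalized accuracy and prints the strongest and weakest ones:

```python
import json

with open("results.json") as f:  # hypothetical local copy of the JSON block above
    results = json.load(f)

# Keep only per-task entries that report a normalized accuracy (skips the "all" aggregate)
tasks = {
    name: scores["acc_norm"]
    for name, scores in results.items()
    if isinstance(scores, dict) and "acc_norm" in scores and name != "all"
}

# Show the five strongest and five weakest tasks for this model
ranked = sorted(tasks.items(), key=lambda kv: kv[1], reverse=True)
for name, acc_norm in ranked[:5] + ranked[-5:]:
    print(f"{name:60s} {acc_norm:.3f}")
```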
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1 | [
"region:us"
]
| 2023-12-08T01:03:37+00:00 | {"pretty_name": "Evaluation run of TokenBender/pic_7B_mistral_Full_v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [TokenBender/pic_7B_mistral_Full_v0.1](https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T01:00:48.190749](https://huggingface.co/datasets/open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1/blob/main/results_2023-12-08T01-00-48.190749.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6335554932812488,\n \"acc_stderr\": 0.03234608898724019,\n \"acc_norm\": 0.6365587293846601,\n \"acc_norm_stderr\": 0.03299054248415427,\n \"mc1\": 0.379436964504284,\n \"mc1_stderr\": 0.016987039266142978,\n \"mc2\": 0.5451115248499341,\n \"mc2_stderr\": 0.015141183727073078\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349812,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6406094403505278,\n \"acc_stderr\": 0.004788412062375695,\n \"acc_norm\": 0.8369846644094802,\n \"acc_norm_stderr\": 0.0036862475593618534\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155236,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155236\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.0291265228345868,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.0291265228345868\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834838,\n \"acc_norm\": 0.822477650063857,\n 
\"acc_norm_stderr\": 0.013664230995834838\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647886,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647886\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n \"acc_stderr\": 0.01596103667523096,\n \"acc_norm\": 0.35083798882681566,\n \"acc_norm_stderr\": 0.01596103667523096\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868062,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868062\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n \"acc_stderr\": 0.012713845972358983,\n \"acc_norm\": 0.4530638852672751,\n \"acc_norm_stderr\": 0.012713845972358983\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n \"mc1_stderr\": 0.016987039266142978,\n \"mc2\": 0.5451115248499341,\n \"mc2_stderr\": 0.015141183727073078\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643416\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5269143290371494,\n \"acc_stderr\": 0.013752517189717447\n }\n}\n```", "repo_url": "https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-00-48.190749.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-00-48.190749.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-00-48.190749.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-00-48.190749.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-00-48.190749.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-00-48.190749.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["**/details_harness|winogrande|5_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T01-00-48.190749.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T01_00_48.190749", "path": ["results_2023-12-08T01-00-48.190749.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T01-00-48.190749.parquet"]}]}]} | 2023-12-08T01:04:22+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of TokenBender/pic_7B_mistral_Full_v0.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TokenBender/pic_7B_mistral_Full_v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-08T01:00:48.190749 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of TokenBender/pic_7B_mistral_Full_v0.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TokenBender/pic_7B_mistral_Full_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:00:48.190749(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TokenBender/pic_7B_mistral_Full_v0.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TokenBender/pic_7B_mistral_Full_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:00:48.190749(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
08efb2d1cfc70fdfe1c69459d63e9e0160a5f59e |
# Dataset Card for Evaluation run of Biomimicry-AI/ANIMA-Nectar-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Biomimicry-AI/ANIMA-Nectar-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Biomimicry-AI/ANIMA-Nectar-v2](https://huggingface.co/Biomimicry-AI/ANIMA-Nectar-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Biomimicry-AI__ANIMA-Nectar-v2",
"harness_winogrande_5",
split="train")
```
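
The aggregated metrics mentioned above live in the separate "results" configuration; a minimal sketch for reading them (assuming the `latest` split naming used by the per-task configurations also applies here, as the config list of this repository suggests) could look like this:

```python
from datasets import load_dataset

# Load the aggregated results of the most recent evaluation run;
# the "latest" split points at the newest results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_Biomimicry-AI__ANIMA-Nectar-v2",
    "results",
    split="latest",
)

# Inspect which columns are available before drilling into a metric.
print(results.column_names)
```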
## Latest results
These are the [latest results from run 2023-12-08T01:10:29.378715](https://huggingface.co/datasets/open-llm-leaderboard/details_Biomimicry-AI__ANIMA-Nectar-v2/blob/main/results_2023-12-08T01-10-29.378715.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5370126868865708,
"acc_stderr": 0.03419039205847925,
"acc_norm": 0.5457430578340143,
"acc_norm_stderr": 0.03502116674648576,
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347036,
"mc2": 0.49035413533539823,
"mc2_stderr": 0.014716498676044533
},
"harness|arc:challenge|25": {
"acc": 0.4974402730375427,
"acc_stderr": 0.014611199329843784,
"acc_norm": 0.5324232081911263,
"acc_norm_stderr": 0.014580637569995416
},
"harness|hellaswag|10": {
"acc": 0.5637323242381995,
"acc_stderr": 0.004949080334816018,
"acc_norm": 0.7662816172077276,
"acc_norm_stderr": 0.004223302177263008
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.041711158581816184,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.041711158581816184
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851102,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851102
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957543,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957543
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.034912078574865175,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.034912078574865175
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.02525448542479961,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.02525448542479961
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094527,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094527
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.03247390276569669,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.03247390276569669
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.019028486711115438,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.019028486711115438
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03354092437591518,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03354092437591518
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.03038193194999041,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.03038193194999041
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.722860791826309,
"acc_stderr": 0.016005636294122425,
"acc_norm": 0.722860791826309,
"acc_norm_stderr": 0.016005636294122425
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.026756255129663762,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.026756255129663762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310267,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310267
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.027809322585774503,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.027809322585774503
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132146,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963768,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963768
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3767926988265971,
"acc_stderr": 0.012376459593894402,
"acc_norm": 0.3767926988265971,
"acc_norm_stderr": 0.012376459593894402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5016339869281046,
"acc_stderr": 0.020227726838150117,
"acc_norm": 0.5016339869281046,
"acc_norm_stderr": 0.020227726838150117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512698,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512698
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347036,
"mc2": 0.49035413533539823,
"mc2_stderr": 0.014716498676044533
},
"harness|winogrande|5": {
"acc": 0.7411207576953434,
"acc_stderr": 0.012310515810993376
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.006298221796179562
}
}
```
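
If you only need a few headline numbers from the blob above, a small standalone sketch is enough; the snippet below embeds a trimmed copy of the JSON shown here (only the keys it reads), so treat it as illustrative and swap in the full results file when working with the real data:

```python
import json

# Trimmed copy of the results blob above (only the keys read below),
# kept inline so the example runs standalone.
raw_json_text = """
{
  "all": {"acc": 0.5370126868865708, "acc_norm": 0.5457430578340143},
  "harness|arc:challenge|25": {"acc": 0.4974402730375427, "acc_norm": 0.5324232081911263},
  "harness|gsm8k|5": {"acc": 0.05534495830174375}
}
"""

latest_results = json.loads(raw_json_text)

# Print a few headline metrics from the run.
print(f"Average accuracy:       {latest_results['all']['acc']:.4f}")
print(f"ARC-Challenge acc_norm: {latest_results['harness|arc:challenge|25']['acc_norm']:.4f}")
print(f"GSM8K accuracy:         {latest_results['harness|gsm8k|5']['acc']:.4f}")
```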
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Biomimicry-AI__ANIMA-Nectar-v2 | [
"region:us"
]
| 2023-12-08T01:13:21+00:00 | {"pretty_name": "Evaluation run of Biomimicry-AI/ANIMA-Nectar-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Biomimicry-AI/ANIMA-Nectar-v2](https://huggingface.co/Biomimicry-AI/ANIMA-Nectar-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Biomimicry-AI__ANIMA-Nectar-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T01:10:29.378715](https://huggingface.co/datasets/open-llm-leaderboard/details_Biomimicry-AI__ANIMA-Nectar-v2/blob/main/results_2023-12-08T01-10-29.378715.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5370126868865708,\n \"acc_stderr\": 0.03419039205847925,\n \"acc_norm\": 0.5457430578340143,\n \"acc_norm_stderr\": 0.03502116674648576,\n \"mc1\": 0.3427172582619339,\n \"mc1_stderr\": 0.016614949385347036,\n \"mc2\": 0.49035413533539823,\n \"mc2_stderr\": 0.014716498676044533\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4974402730375427,\n \"acc_stderr\": 0.014611199329843784,\n \"acc_norm\": 0.5324232081911263,\n \"acc_norm_stderr\": 0.014580637569995416\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5637323242381995,\n \"acc_stderr\": 0.004949080334816018,\n \"acc_norm\": 0.7662816172077276,\n \"acc_norm_stderr\": 0.004223302177263008\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n \"acc_stderr\": 0.041711158581816184,\n \"acc_norm\": 0.5347222222222222,\n \"acc_norm_stderr\": 0.041711158581816184\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 
0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851102,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851102\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n \"acc_stderr\": 0.027327548447957543,\n \"acc_norm\": 0.6387096774193548,\n \"acc_norm_stderr\": 0.027327548447957543\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.034912078574865175,\n \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.034912078574865175\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.02525448542479961,\n \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.02525448542479961\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094527,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094527\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.03247390276569669,\n \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.03247390276569669\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7302752293577982,\n \"acc_stderr\": 0.019028486711115438,\n \"acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.019028486711115438\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591518,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591518\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.679324894514768,\n \"acc_stderr\": 0.03038193194999041,\n \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.03038193194999041\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.722860791826309,\n \"acc_stderr\": 0.016005636294122425,\n \"acc_norm\": 0.722860791826309,\n 
\"acc_norm_stderr\": 0.016005636294122425\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.026756255129663762,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.026756255129663762\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n \"acc_stderr\": 0.016051419760310267,\n \"acc_norm\": 0.35977653631284917,\n \"acc_norm_stderr\": 0.016051419760310267\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n \"acc_stderr\": 0.027809322585774503,\n \"acc_norm\": 0.6012861736334405,\n \"acc_norm_stderr\": 0.027809322585774503\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132146,\n \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132146\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963768,\n \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963768\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3767926988265971,\n \"acc_stderr\": 0.012376459593894402,\n \"acc_norm\": 0.3767926988265971,\n \"acc_norm_stderr\": 0.012376459593894402\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5016339869281046,\n \"acc_stderr\": 0.020227726838150117,\n \"acc_norm\": 0.5016339869281046,\n \"acc_norm_stderr\": 0.020227726838150117\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n \"acc_stderr\": 0.03265819588512698,\n \"acc_norm\": 0.6915422885572139,\n \"acc_norm_stderr\": 0.03265819588512698\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3427172582619339,\n \"mc1_stderr\": 0.016614949385347036,\n \"mc2\": 0.49035413533539823,\n \"mc2_stderr\": 0.014716498676044533\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7411207576953434,\n \"acc_stderr\": 0.012310515810993376\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \"acc_stderr\": 0.006298221796179562\n }\n}\n```", "repo_url": "https://huggingface.co/Biomimicry-AI/ANIMA-Nectar-v2", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-10-29.378715.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-10-29.378715.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-10-29.378715.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-10-29.378715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-10-29.378715.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-10-29.378715.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["**/details_harness|winogrande|5_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T01-10-29.378715.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T01_10_29.378715", "path": ["results_2023-12-08T01-10-29.378715.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T01-10-29.378715.parquet"]}]}]} | 2023-12-08T01:14:05+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of Biomimicry-AI/ANIMA-Nectar-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Biomimicry-AI/ANIMA-Nectar-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
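(The id of the details dataset is not spelled out in this stripped card, so the snippet below is a minimal sketch assuming the leaderboard's usual `details_<org>__<model>` naming and one of the config names listed in the metadata above.)

```python
from datasets import load_dataset

# Per-sample details for one task; the "train" split always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_Biomimicry-AI__ANIMA-Nectar-v2",  # assumed repo id
    "harness_winogrande_5",  # any config name from this card can be used here
    split="train",
)
```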
## Latest results
These are the latest results from run 2023-12-08T01:10:29.378715 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Biomimicry-AI/ANIMA-Nectar-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Biomimicry-AI/ANIMA-Nectar-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:10:29.378715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Biomimicry-AI/ANIMA-Nectar-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Biomimicry-AI/ANIMA-Nectar-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:10:29.378715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
1913642e2eb65109fc2588eee79db069251cf454 |
# Dataset Card for Evaluation run of beberik/Nyxene-v2-11B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beberik/Nyxene-v2-11B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [beberik/Nyxene-v2-11B](https://huggingface.co/beberik/Nyxene-v2-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-v2-11B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-08T01:12:34.266579](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-v2-11B/blob/main/results_2023-12-08T01-12-34.266579.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6531918542009467,
"acc_stderr": 0.032013202505680724,
"acc_norm": 0.6561995973883771,
"acc_norm_stderr": 0.032650200642534,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.0171158156324182,
"mc2": 0.5561828601419777,
"mc2_stderr": 0.015469854760875708
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.01397545412275656,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693247
},
"harness|hellaswag|10": {
"acc": 0.6603266281617207,
"acc_stderr": 0.0047263042251373165,
"acc_norm": 0.8454491137223661,
"acc_norm_stderr": 0.0036073726062951015
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368881,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368881
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970572,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970572
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.02880139219363127,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.02880139219363127
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508762,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508762
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934725,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934725
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579827,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.01624202883405362,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.01624202883405362
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340856,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340856
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730583,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02767846864214472,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02767846864214472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274645,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.0171158156324182,
"mc2": 0.5561828601419777,
"mc2_stderr": 0.015469854760875708
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597214
},
"harness|gsm8k|5": {
"acc": 0.5466262319939348,
"acc_stderr": 0.01371247104951544
}
}
```
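To read these aggregated numbers programmatically rather than from the JSON above, you can load the "results" configuration of this details dataset — a minimal sketch; the exact column layout of the parquet records may differ from the JSON shown here:

```python
from datasets import load_dataset

# The "latest" split of the "results" config mirrors the aggregated metrics above.
results = load_dataset(
    "open-llm-leaderboard/details_beberik__Nyxene-v2-11B",
    "results",
    split="latest",
)
print(results[0])  # one record with the per-task acc / acc_norm / mc1 / mc2 values
```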
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_beberik__Nyxene-v2-11B | [
"region:us"
]
| 2023-12-08T01:15:25+00:00 | {"pretty_name": "Evaluation run of beberik/Nyxene-v2-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [beberik/Nyxene-v2-11B](https://huggingface.co/beberik/Nyxene-v2-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beberik__Nyxene-v2-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T01:12:34.266579](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-v2-11B/blob/main/results_2023-12-08T01-12-34.266579.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6531918542009467,\n \"acc_stderr\": 0.032013202505680724,\n \"acc_norm\": 0.6561995973883771,\n \"acc_norm_stderr\": 0.032650200642534,\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.0171158156324182,\n \"mc2\": 0.5561828601419777,\n \"mc2_stderr\": 0.015469854760875708\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.01397545412275656,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693247\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6603266281617207,\n \"acc_stderr\": 0.0047263042251373165,\n \"acc_norm\": 0.8454491137223661,\n \"acc_norm_stderr\": 0.0036073726062951015\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368881,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368881\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970572,\n \"acc_norm\": 
0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970572\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.02880139219363127,\n \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.02880139219363127\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508762,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508762\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934725,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934725\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n \"acc_stderr\": 0.013740797258579827,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579827\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n \"acc_stderr\": 0.01624202883405362,\n \"acc_norm\": 0.38100558659217876,\n \"acc_norm_stderr\": 0.01624202883405362\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340856,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340856\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303055,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303055\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730583,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730583\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02767846864214472,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02767846864214472\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.0171158156324182,\n \"mc2\": 0.5561828601419777,\n \"mc2_stderr\": 0.015469854760875708\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597214\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5466262319939348,\n \"acc_stderr\": 0.01371247104951544\n }\n}\n```", "repo_url": "https://huggingface.co/beberik/Nyxene-v2-11B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-12-34.266579.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-12-34.266579.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-12-34.266579.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-12-34.266579.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-12-34.266579.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-12-34.266579.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["**/details_harness|winogrande|5_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T01-12-34.266579.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T01_12_34.266579", "path": ["results_2023-12-08T01-12-34.266579.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T01-12-34.266579.parquet"]}]}]} | 2023-12-08T01:16:12+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of beberik/Nyxene-v2-11B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model beberik/Nyxene-v2-11B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
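(Any of the 63 configuration names can be substituted for "harness_winogrande_5" below; this mirrors the loading call recorded in this card's metadata.)

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-v2-11B",
	"harness_winogrande_5",
	split="train")
```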
## Latest results
These are the latest results from run 2023-12-08T01:12:34.266579 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
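A minimal sketch for pulling these aggregated numbers programmatically, assuming the "results" configuration and its "latest" split declared in this dataset's metadata:

```python
from datasets import load_dataset

# The "results" config aggregates the per-task scores; "latest" points to the newest run.
results = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-v2-11B",
	"results",
	split="latest")
print(results[0])  # aggregated acc / stderr values for the most recent evaluation
```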
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of beberik/Nyxene-v2-11B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model beberik/Nyxene-v2-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:12:34.266579(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of beberik/Nyxene-v2-11B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model beberik/Nyxene-v2-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:12:34.266579(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
b34e0035ee5c83ad767deff4c4a3f9bd86bc3c5b |
# Dataset Card for Evaluation run of NurtureAI/neural-chat-11b-v3-2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NurtureAI/neural-chat-11b-v3-2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [NurtureAI/neural-chat-11b-v3-2](https://huggingface.co/NurtureAI/neural-chat-11b-v3-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NurtureAI__neural-chat-11b-v3-2",
"harness_winogrande_5",
split="train")
```
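To enumerate the available configurations before loading, a small sketch (the config names are assumed to follow the same `harness_<task>_<n_fewshot>` pattern as the other leaderboard detail datasets):

```python
from datasets import get_dataset_config_names

# Lists all 63 task configurations plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_NurtureAI__neural-chat-11b-v3-2")
print(len(configs), configs[:5])
```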
## Latest results
These are the [latest results from run 2023-12-08T01:21:45.753346](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__neural-chat-11b-v3-2/blob/main/results_2023-12-08T01-21-45.753346.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6233413317621571,
"acc_stderr": 0.0328873097315513,
"acc_norm": 0.6277933709363942,
"acc_norm_stderr": 0.0335567202427567,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.01737452048251371,
"mc2": 0.6022413409108424,
"mc2_stderr": 0.015142783614018333
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759095,
"acc_norm": 0.6663822525597269,
"acc_norm_stderr": 0.013778687054176536
},
"harness|hellaswag|10": {
"acc": 0.627365066719777,
"acc_stderr": 0.004825179407757565,
"acc_norm": 0.8211511651065525,
"acc_norm_stderr": 0.003824424844466082
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462826,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976054,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976054
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808507,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808507
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597518,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597518
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611571,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611571
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879713,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3754189944134078,
"acc_stderr": 0.01619510424846353,
"acc_norm": 0.3754189944134078,
"acc_norm_stderr": 0.01619510424846353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630457,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.01737452048251371,
"mc2": 0.6022413409108424,
"mc2_stderr": 0.015142783614018333
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626913
},
"harness|gsm8k|5": {
"acc": 0.42153146322971946,
"acc_stderr": 0.013601824409483272
}
}
```
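These per-task scores can also be retrieved programmatically. A minimal sketch (assuming the aggregated "results" configuration and its "latest" split listed in this repository's configs):

```python
from datasets import load_dataset

# Load the aggregated metrics produced by the most recent evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_NurtureAI__neural-chat-11b-v3-2",
    "results",
    split="latest")
```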
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_NurtureAI__neural-chat-11b-v3-2 | [
"region:us"
]
| 2023-12-08T01:24:38+00:00 | {"pretty_name": "Evaluation run of NurtureAI/neural-chat-11b-v3-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [NurtureAI/neural-chat-11b-v3-2](https://huggingface.co/NurtureAI/neural-chat-11b-v3-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NurtureAI__neural-chat-11b-v3-2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T01:21:45.753346](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__neural-chat-11b-v3-2/blob/main/results_2023-12-08T01-21-45.753346.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6233413317621571,\n \"acc_stderr\": 0.0328873097315513,\n \"acc_norm\": 0.6277933709363942,\n \"acc_norm_stderr\": 0.0335567202427567,\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.01737452048251371,\n \"mc2\": 0.6022413409108424,\n \"mc2_stderr\": 0.015142783614018333\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759095,\n \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176536\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.627365066719777,\n \"acc_stderr\": 0.004825179407757565,\n \"acc_norm\": 0.8211511651065525,\n \"acc_norm_stderr\": 0.003824424844466082\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462826,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976054,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976054\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808507,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808507\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597518,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597518\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7969348659003831,\n \"acc_stderr\": 0.014385525076611571,\n \"acc_norm\": 0.7969348659003831,\n \"acc_norm_stderr\": 0.014385525076611571\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879713,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879713\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3754189944134078,\n \"acc_stderr\": 0.01619510424846353,\n \"acc_norm\": 0.3754189944134078,\n \"acc_norm_stderr\": 0.01619510424846353\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630457,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.01737452048251371,\n \"mc2\": 0.6022413409108424,\n \"mc2_stderr\": 0.015142783614018333\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626913\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42153146322971946,\n \"acc_stderr\": 
0.013601824409483272\n }\n}\n```", "repo_url": "https://huggingface.co/NurtureAI/neural-chat-11b-v3-2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-21-45.753346.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-21-45.753346.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-21-45.753346.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-21-45.753346.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-21-45.753346.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T01_21_45.753346", "path": ["**/details_harness|winogrande|5_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T01-21-45.753346.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_08T01_21_45.753346", "path": ["results_2023-12-08T01-21-45.753346.parquet"]}, {"split": "latest", "path": ["results_2023-12-08T01-21-45.753346.parquet"]}]}]} | 2023-12-08T01:25:25+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of NurtureAI/neural-chat-11b-v3-2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model NurtureAI/neural-chat-11b-v3-2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
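A minimal example (using the `harness_winogrande_5` configuration, one of the per-task configurations listed for this repository):
```python
from datasets import load_dataset

# Load the per-sample details for the Winogrande (5-shot) task from the latest run
data = load_dataset("open-llm-leaderboard/details_NurtureAI__neural-chat-11b-v3-2",
    "harness_winogrande_5",
    split="train")
```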
## Latest results
These are the latest results from run 2023-12-08T01:21:45.753346 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of NurtureAI/neural-chat-11b-v3-2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NurtureAI/neural-chat-11b-v3-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:21:45.753346(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NurtureAI/neural-chat-11b-v3-2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NurtureAI/neural-chat-11b-v3-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:21:45.753346(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
0998cde8f14d0e1caeabc12e21c1f123417a6d6d |
# Dataset Card for Evaluation run of beberik/Nyxene-v1-11B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beberik/Nyxene-v1-11B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [beberik/Nyxene-v1-11B](https://huggingface.co/beberik/Nyxene-v1-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details for one task (here: 5-shot Winogrande);
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-v1-11B",
	"harness_winogrande_5",
	split="train")
```
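The aggregated scores live in the "results" configuration. A minimal sketch of loading them (the configuration name and the "latest" split are taken from this card; the call otherwise mirrors the snippet above):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-v1-11B",
	"results",
	split="latest")
print(results[0])  # one row containing the aggregated scores
```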
## Latest results
These are the [latest results from run 2023-12-08T01:32:44.134734](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-v1-11B/blob/main/results_2023-12-08T01-32-44.134734.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6514001005008682,
"acc_stderr": 0.032003872263626075,
"acc_norm": 0.6548783767702248,
"acc_norm_stderr": 0.03263791027558859,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093897,
"mc2": 0.5727980289823402,
"mc2_stderr": 0.015500934892748477
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.01396014260059868,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729119
},
"harness|hellaswag|10": {
"acc": 0.6592312288388767,
"acc_stderr": 0.004729990807895058,
"acc_norm": 0.8452499502091216,
"acc_norm_stderr": 0.003609271000593056
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337145,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337145
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474898,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.01346820161406631,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.01346820161406631
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257803,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36983240223463687,
"acc_stderr": 0.016145881256056215,
"acc_norm": 0.36983240223463687,
"acc_norm_stderr": 0.016145881256056215
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460852,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460852
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214963,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740546,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740546
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093897,
"mc2": 0.5727980289823402,
"mc2_stderr": 0.015500934892748477
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.5208491281273692,
"acc_stderr": 0.013760506094029866
}
}
```
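As a rough illustration of how the per-task entries in a results dict shaped like the JSON above can be aggregated (the plain mean over the hendrycksTest subtasks below is an assumption for illustration, not necessarily the leaderboard's exact scheme):

```python
# Sketch: average the 5-shot MMLU (hendrycksTest) accuracies from a dict
# shaped like the JSON above; only two subtasks are filled in here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    # ... the remaining hendrycksTest subtasks would go here ...
}

mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"Mean accuracy over {len(mmlu_accs)} MMLU subtasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```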
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_beberik__Nyxene-v1-11B | [
"region:us"
]
| 2023-12-08T01:35:37+00:00 | {"pretty_name": "Evaluation run of beberik/Nyxene-v1-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [beberik/Nyxene-v1-11B](https://huggingface.co/beberik/Nyxene-v1-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beberik__Nyxene-v1-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-08T01:32:44.134734](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-v1-11B/blob/main/results_2023-12-08T01-32-44.134734.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6514001005008682,\n \"acc_stderr\": 0.032003872263626075,\n \"acc_norm\": 0.6548783767702248,\n \"acc_norm_stderr\": 0.03263791027558859,\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093897,\n \"mc2\": 0.5727980289823402,\n \"mc2_stderr\": 0.015500934892748477\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.01396014260059868,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729119\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6592312288388767,\n \"acc_stderr\": 0.004729990807895058,\n \"acc_norm\": 0.8452499502091216,\n \"acc_norm_stderr\": 0.003609271000593056\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337145,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337145\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 
0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474898,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n 
\"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.01346820161406631,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 
0.01346820161406631\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257803,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257803\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n \"acc_stderr\": 0.016145881256056215,\n \"acc_norm\": 0.36983240223463687,\n \"acc_norm_stderr\": 0.016145881256056215\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460852,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460852\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740546,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740546\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093897,\n \"mc2\": 0.5727980289823402,\n \"mc2_stderr\": 0.015500934892748477\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5208491281273692,\n \"acc_stderr\": 0.013760506094029866\n }\n}\n```", "repo_url": "https://huggingface.co/beberik/Nyxene-v1-11B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-32-44.134734.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-32-44.134734.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-32-44.134734.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T01-32-44.134734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-32-44.134734.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-32-44.134734.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["**/details_harness|winogrande|5_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T01-32-44.134734.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T01_32_44.134734", "path": ["results_2023-12-08T01-32-44.134734.parquet"]}, {"split": "latest", "path": 
["results_2023-12-08T01-32-44.134734.parquet"]}]}]} | 2023-12-08T01:36:21+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of beberik/Nyxene-v1-11B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model beberik/Nyxene-v1-11B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-08T01:32:44.134734 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of beberik/Nyxene-v1-11B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model beberik/Nyxene-v1-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:32:44.134734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of beberik/Nyxene-v1-11B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model beberik/Nyxene-v1-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-08T01:32:44.134734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|
e50f425ce66812343e5297f51d10e9501b09d48c |
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-16B-v2.01
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/perlthoughts/Chupacabra-16B-v2.01
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-16B-v2.01](https://huggingface.co/perlthoughts/Chupacabra-16B-v2.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01",
"harness_winogrande_5",
split="train")
```
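You can also enumerate the per-task configurations stored in the repository, or inspect individual rows of a loaded split, with the standard `datasets` utilities (a minimal sketch; column names vary from task to task):

```python
from datasets import get_dataset_config_names, load_dataset

# List every per-task configuration available in this repository
configs = get_dataset_config_names("open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01")
print(len(configs), configs[:5])

# Peek at the latest winogrande details: one row per evaluated example
details = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01",
	"harness_winogrande_5",
	split="latest")
print(details.column_names)
print(details[0])
```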
## Latest results
These are the [latest results from run 2023-12-09T17:22:21.064406](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01/blob/main/results_2023-12-09T17-22-21.064406.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6291644824323459,
"acc_stderr": 0.03274488948991142,
"acc_norm": 0.6364297874777888,
"acc_norm_stderr": 0.033437033730375994,
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6453446072596959,
"mc2_stderr": 0.015246087407331466
},
"harness|arc:challenge|25": {
"acc": 0.6296928327645052,
"acc_stderr": 0.01411129875167495,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063232
},
"harness|hellaswag|10": {
"acc": 0.6387173869747063,
"acc_stderr": 0.004793904922401889,
"acc_norm": 0.8292172873929496,
"acc_norm_stderr": 0.003755498941781851
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342856,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342856
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092437,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159267,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159267
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611567,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677006,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677006
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.016421670506339185,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.016421670506339185
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787686,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787686
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02767846864214472,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02767846864214472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683903,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683903
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6453446072596959,
"mc2_stderr": 0.015246087407331466
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881578
},
"harness|gsm8k|5": {
"acc": 0.2532221379833207,
"acc_stderr": 0.011978125194299687
}
}
```
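The same aggregated metrics are stored in the "results" configuration, so they can be reloaded programmatically (a minimal sketch; the fields mirror the JSON structure above):

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics shown above
results = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01",
	"results",
	split="latest")
print(results[0])
```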
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01 | [
"region:us"
]
| 2023-12-08T02:11:38+00:00 | {"pretty_name": "Evaluation run of perlthoughts/Chupacabra-16B-v2.01", "dataset_summary": "Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-16B-v2.01](https://huggingface.co/perlthoughts/Chupacabra-16B-v2.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T17:22:21.064406](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01/blob/main/results_2023-12-09T17-22-21.064406.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6291644824323459,\n \"acc_stderr\": 0.03274488948991142,\n \"acc_norm\": 0.6364297874777888,\n \"acc_norm_stderr\": 0.033437033730375994,\n \"mc1\": 0.4920440636474908,\n \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6453446072596959,\n \"mc2_stderr\": 0.015246087407331466\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6296928327645052,\n \"acc_stderr\": 0.01411129875167495,\n \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063232\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6387173869747063,\n \"acc_stderr\": 0.004793904922401889,\n \"acc_norm\": 0.8292172873929496,\n \"acc_norm_stderr\": 0.003755498941781851\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400352,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400352\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479047,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479047\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342856,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342856\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092437,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092437\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159267,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159267\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7969348659003831,\n \"acc_stderr\": 0.014385525076611567,\n \"acc_norm\": 0.7969348659003831,\n \"acc_norm_stderr\": 0.014385525076611567\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677006,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677006\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n \"acc_stderr\": 0.016421670506339185,\n \"acc_norm\": 0.40558659217877097,\n \"acc_norm_stderr\": 0.016421670506339185\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n \"acc_stderr\": 0.012689708167787686,\n \"acc_norm\": 0.4439374185136897,\n \"acc_norm_stderr\": 0.012689708167787686\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02767846864214472,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02767846864214472\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683903,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683903\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4920440636474908,\n \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6453446072596959,\n \"mc2_stderr\": 0.015246087407331466\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881578\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2532221379833207,\n \"acc_stderr\": 0.011978125194299687\n }\n}\n```", "repo_url": 
"https://huggingface.co/perlthoughts/Chupacabra-16B-v2.01", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|arc:challenge|25_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|gsm8k|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hellaswag|10_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-08-47.844785.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T02-08-47.844785.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-22-21.064406.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-22-21.064406.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-22-21.064406.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-22-21.064406.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-08-47.844785.parquet"]}, 
{"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["**/details_harness|winogrande|5_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": ["**/details_harness|winogrande|5_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T17-22-21.064406.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_08T02_08_47.844785", "path": ["results_2023-12-08T02-08-47.844785.parquet"]}, {"split": "2023_12_09T17_22_21.064406", "path": 
["results_2023-12-09T17-22-21.064406.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T17-22-21.064406.parquet"]}]}]} | 2023-12-09T17:25:15+00:00 | []
| []
| TAGS
#region-us
|
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-16B-v2.01
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model perlthoughts/Chupacabra-16B-v2.01 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
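A minimal sketch with the `datasets` library is shown below; the repository id and config name are assumptions, inferred from the model name using the leaderboard's usual `details_<org>__<model>` pattern and the config list above, rather than values confirmed by this card:

```python
from datasets import load_dataset

# Assumed repository id, built from the model name following the
# "open-llm-leaderboard/details_<org>__<model>" convention; adjust it if the actual repo differs.
data = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01",
    "harness_winogrande_5",   # any config name from the metadata above works here
    split="latest",           # or a timestamped split such as "2023_12_09T17_22_21.064406"
)
print(data[0])
```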
## Latest results
These are the latest results from run 2023-12-09T17:22:21.064406 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
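As a sketch (under the same repository-id assumption as above), the aggregated metrics for the most recent run can be read from the "results" configuration:

```python
from datasets import load_dataset

# Same assumed repository id as in the example above.
results = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest run
```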
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of perlthoughts/Chupacabra-16B-v2.01",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-16B-v2.01 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-09T17:22:21.064406(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
| [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of perlthoughts/Chupacabra-16B-v2.01",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-16B-v2.01 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-09T17:22:21.064406(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
]
|